{"id":20267,"date":"2025-08-06T12:17:44","date_gmt":"2025-08-06T04:17:44","guid":{"rendered":"https:\/\/aicats.wiki\/sites\/20267.html"},"modified":"2025-08-06T12:17:57","modified_gmt":"2025-08-06T04:17:57","slug":"openbmb","status":"publish","type":"sites","link":"https:\/\/aicats.wiki\/en\/sites\/20267-html","title":{"rendered":"OpenBMB"},"content":{"rendered":"<p><strong>OpenBMB<\/strong> is an <a href=\"https:\/\/aicats.wiki\/en\/2025\/07\/02\/5583-html\/\" title=\"A collection of recommended open-source AI platforms: 5 AI tools you should definitely try in 2025.\">AI open source community<\/a> jointly supported by Tsinghua University, the Academy of Artificial Intelligence, and other institutions. With the mission of &quot;bringing large models to every household,&quot; the platform focuses on democratizing large language models and their ecosystem tools, providing a <strong>complete AI model toolchain<\/strong> that lowers the barriers to AI research and application. <strong>OpenBMB<\/strong> plays an important role in the open sharing of AI training tools and promotes the popularization of AI innovation.<\/p>\n\n\n\n<p><em>Source:<\/em> <a href=\"https:\/\/www.openbmb.cn\/home\" title=\"\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >OpenBMB Official Website<\/a><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p>In today&#039;s era of rapid AI development, <strong>large language models (LLMs)<\/strong> and related AI model platforms have become an important driving force for industrial innovation. Backed by prestigious academic institutions such as Tsinghua University and the Academy of Artificial Intelligence, <strong>OpenBMB<\/strong>, with the mission of &quot;bringing large-scale AI models to every household,&quot; is promoting the democratization, standardization, and practical application of large-scale AI models. 
This article provides a detailed introduction to the OpenBMB platform in the style of a news report, covering its background, functions, pricing, usage, and target audience. Lists, tables, and authoritative links are interspersed throughout to help readers efficiently understand this representative platform of China&#039;s homegrown AI open-source ecosystem.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Introduction to OpenBMB and Platform Vision<\/h2>\n\n\n\n<p><strong>OpenBMB (Open Lab for Big Model Base)<\/strong> is one of the most influential AI open-source communities in China, focusing on the development of <strong>large-scale pre-trained language model libraries<\/strong>, infrastructure, and related ecosystem tools. OpenBMB was initiated by the Natural Language Processing Laboratory of Tsinghua University and the Academy of Artificial Intelligence, aiming to lower the barriers to training, deploying, and applying large models with hundreds of billions of parameters, and to accelerate the integration of large-model technology with real-world scenarios.<\/p>\n\n\n\n<p>According to the <a href=\"https:\/\/www.openbmb.cn\/home\" title=\"\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >OpenBMB Official Website<\/a>, the team behind the platform comprises Chinese <strong>pioneers in natural language processing and pre-trained model research<\/strong>. The team has published well-known models such as CPM-1, CPM-2, CPM-3, and ERNIE at top global conferences, and has laid a solid foundation in areas such as knowledge-guided pre-training, parameter-efficient fine-tuning, and model compression. 
Based on open source, community, and standardization, the platform provides developers, researchers, enterprises, and the general public with a one-stop large-model infrastructure.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img fetchpriority=\"high\" decoding=\"async\" width=\"1593\" height=\"914\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/08\/image-162.png\" alt=\"OpenBMB Official Website\" class=\"wp-image-22598\"\/><figcaption class=\"wp-element-caption\">Photo\/<a href=\"https:\/\/www.openbmb.cn\/home\" title=\"\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >OpenBMB Official Website<\/a><\/figcaption><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Main functions of OpenBMB<\/h2>\n\n\n\n<p>OpenBMB provides a <strong>complete AI model toolchain<\/strong> covering multiple stages of model development, including training, compression, inference, fine-tuning, and prompt engineering, greatly reducing the barriers to AI research and application. 
The following list and table summarize the core functional modules of OpenBMB:<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Feature list<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>BMTrain<\/strong>: a highly efficient engine for pre-training and fine-tuning large models, cutting training costs by up to 90% compared with similar frameworks.<\/li>\n\n\n\n<li><strong>BMCook<\/strong>: a model compression library combining multiple algorithms, preserving performance while accelerating inference by more than 10x.<\/li>\n\n\n\n<li><strong>BMInf<\/strong>: a low-cost large-model inference solution that can efficiently run models with billions of parameters on ordinary graphics cards costing around a thousand yuan.<\/li>\n\n\n\n<li><strong>OpenPrompt<\/strong>: a unified interface and template language for prompt learning, driving innovation in prompt engineering.<\/li>\n\n\n\n<li><strong>OpenDelta<\/strong>: efficient parameter-efficient (delta) tuning that works together with OpenPrompt. 
Fine-tuning fewer than 5% of parameters can match the effect of full fine-tuning.<\/li>\n\n\n\n<li><strong>ModelCenter<\/strong>: a comprehensive model repository supporting rapid fine-tuning and distributed training of mainstream AI models.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Function Comparison Table<\/h3>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><tbody><tr><th>Tool Name<\/th><th>Core Use<\/th><th>Technical Features<\/th><th>Typical Application Scenarios<\/th><\/tr><tr><td>BMTrain<\/td><td>Training\/fine-tuning<\/td><td>Distributed, efficient, low-cost; supports extremely large models<\/td><td>LLM and AI model development<\/td><\/tr><tr><td>BMCook<\/td><td>Compression<\/td><td>Quantization, pruning, distillation, and specialization, freely combinable<\/td><td>Edge devices\/compressed deployment<\/td><\/tr><tr><td>BMInf<\/td><td>Inference<\/td><td>Low-cost inference on consumer graphics cards<\/td><td>Popularization of AI applications and terminal deployment<\/td><\/tr><tr><td>OpenPrompt<\/td><td>Prompt engineering<\/td><td>Modular, easy to integrate, adaptable to various tasks<\/td><td>Prompt learning and prompt engineering<\/td><\/tr><tr><td>OpenDelta<\/td><td>Lightweight tuning<\/td><td>Tunes few parameters yet matches full fine-tuning<\/td><td>Rapid AI customization for enterprise\/professional scenarios<\/td><\/tr><tr><td>ModelCenter<\/td><td>Model repository<\/td><td>Efficient fine-tuning, distributed support, multi-language and multi-task<\/td><td>AI model selection and transfer learning<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p>See the <a href=\"https:\/\/www.openbmb.org\/home\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >OpenBMB main features details<\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1593\" height=\"914\" 
src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/08\/image-163.png\" alt=\"openBMB project\" class=\"wp-image-22633\"\/><figcaption class=\"wp-element-caption\">Image\/openBMB project<\/figcaption><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">OpenBMB Pricing &amp; Plans<\/h2>\n\n\n\n<p>As an open source community, <strong>most of OpenBMB&#039;s core resources are freely available to the public<\/strong>. The platform encourages developer participation, corporate collaboration, and academic cooperation to promote the popularization of AI innovation. Specifically, OpenBMB opens up its tools in the following ways:<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><tbody><tr><th>Product\/Function<\/th><th>License Terms<\/th><th>Commercial Use Policy<\/th><th>Open Source Access<\/th><\/tr><tr><td>Tools such as BMTrain and BMCook<\/td><td>Open source and free under standard licenses<\/td><td>Deep customization supported for enterprises<\/td><td><a href=\"https:\/\/github.com\/OpenBMB\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >GitHub OpenBMB<\/a><\/td><\/tr><tr><td>Pre-trained models such as MiniCPM<\/td><td>Free commercial license; source must be credited<\/td><td>Must comply with the open source license<\/td><td><a href=\"https:\/\/www.openbmb.org\/models\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Model release page<\/a><\/td><\/tr><tr><td>API and professional support services<\/td><td>Negotiated based on actual needs<\/td><td>Enterprise version requires customization<\/td><td>Official website <a href=\"https:\/\/www.openbmb.org\/home\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >cooperation portal<\/a><\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p><strong>Note:<\/strong> For large-scale enterprise-level deployment or technical support, it is recommended to 
contact the team through the &quot;Cooperation&quot; channel on the official website.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">How to use OpenBMB<\/h2>\n\n\n\n<p>OpenBMB is designed to be highly modular and <strong>supports flexible on-premises or cloud deployment<\/strong>. Users can efficiently perform AI training, inference, and compression tasks by following these steps:<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Brief description of the usage process<\/h3>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Visit the official website to obtain resources<\/strong><br>Go to the <a href=\"https:\/\/www.openbmb.org\/home\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >OpenBMB Official Website<\/a> or the <a href=\"https:\/\/github.com\/OpenBMB\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >GitHub project homepage<\/a> to browse tools and documentation.<\/li>\n\n\n\n<li><strong>Download toolkits or models<\/strong><br>Choose BMTrain, BMCook, BMInf, etc., as needed, or retrieve models from <a href=\"https:\/\/www.openbmb.org\/models\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >ModelCenter<\/a>.<\/li>\n\n\n\n<li><strong>Consult the developer documentation<\/strong><br>Detailed API references, deployment guidelines, and compatibility information are explained in the <a href=\"https:\/\/www.openbmb.org\/docs\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Technical Documentation Center<\/a>.<\/li>\n\n\n\n<li><strong>Local\/cloud environment configuration<\/strong><br>Only a commonly used deep learning framework environment such as PyTorch is required; some inference modules can run on low-end graphics cards.<\/li>\n\n\n\n<li><strong>Parameter tuning and training\/inference<\/strong><br>Based on the actual scenario, customize the model structure, prompt templates, etc., to improve the performance of the AI 
model.<\/li>\n\n\n\n<li><strong>Community interaction and technical support<\/strong><br>Join the <a href=\"https:\/\/www.openbmb.org\/community\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >forum<\/a> to get support through discussions and code issues.<\/li>\n<\/ol>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1593\" height=\"914\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/08\/image-164.png\" alt=\"GitHub project homepage\" class=\"wp-image-22639\"\/><figcaption class=\"wp-element-caption\">Photo\/<a href=\"https:\/\/github.com\/OpenBMB\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >GitHub project homepage<\/a><\/figcaption><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Who is OpenBMB suitable for?<\/h2>\n\n\n\n<p><strong>OpenBMB<\/strong> targets multiple levels of the AI ecosystem and is widely suitable for the following users:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>AI developers\/researchers<\/strong><br>Gain access to cutting-edge AI model libraries and validation platforms to support research innovation and empirical verification.<\/li>\n\n\n\n<li><strong>Enterprise technology teams<\/strong><br>Meet enterprise-level needs for large-model customization, compressed deployment, and local inference.<\/li>\n\n\n\n<li><strong>Universities and teaching institutions<\/strong><br>Use the platform for course practice, innovative experiments, and talent cultivation.<\/li>\n\n\n\n<li><strong>AI beginners and enthusiasts<\/strong><br>Experience large-model technology with low barriers to entry and participate in open-source collaboration.<\/li>\n\n\n\n<li><strong>Open source communities and non-profit organizations<\/strong><br>Build the AI ecosystem together and promote the democratization of and universal access to technology.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator 
has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">OpenBMB Large Language Models and Representative AI Models<\/h2>\n\n\n\n<p>The OpenBMB platform has released several pre-trained models with industry influence. The table below lists some representative results, with parameter scales and licensing information for reference.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><tbody><tr><th>Model Name<\/th><th>Parameter Size<\/th><th>Release Time<\/th><th>Commercial Licensing<\/th><th>Model Page<\/th><\/tr><tr><td>MiniCPM-V 2.0<\/td><td>2.8 billion<\/td><td>2024-04<\/td><td>Free for commercial use<\/td><td><a href=\"https:\/\/www.openbmb.org\/models\/MiniCPM-V-2.0\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >MiniCPM-V-2.0<\/a><\/td><\/tr><tr><td>MiniCPM-MoE-8x2B<\/td><td>13.6 billion<\/td><td>2024-04<\/td><td>Free for commercial use<\/td><td><a href=\"https:\/\/www.openbmb.org\/models\/MiniCPM-MoE-8x2B\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >MiniCPM-MoE-8x2B<\/a><\/td><\/tr><tr><td>CPM-Bee<\/td><td>10 billion<\/td><td>2023<\/td><td>Free for commercial use<\/td><td><a href=\"https:\/\/www.openbmb.org\/models\/CPM-Bee\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >CPM-Bee<\/a><\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p><strong>Note:<\/strong> All model training data, evaluation benchmarks, and performance comparisons are detailed on the <a href=\"https:\/\/www.openbmb.org\/models\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Model Library Page<\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1593\" height=\"914\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/08\/image-165.png\" alt=\"Model Overview\" class=\"wp-image-22642\"\/><figcaption class=\"wp-element-caption\">Image\/Model 
Overview<\/figcaption><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Typical application scenario examples<\/h2>\n\n\n\n<p><strong>OpenBMB<\/strong> and its AI models have had a significant impact in the following areas:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Intelligent dialogue systems and large-scale question-answering robots<\/li>\n\n\n\n<li>Text generation and content creation assistance<\/li>\n\n\n\n<li>Cross-agent collaboration (multimodal AI, automatic labeling, etc.)<\/li>\n\n\n\n<li>Industry knowledge management and the construction of large-scale enterprise knowledge bases<\/li>\n\n\n\n<li>Low-power AI model embedding in edge\/mobile devices<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Open source community ecosystem and cooperation models<\/h2>\n\n\n\n<p>OpenBMB consistently adheres to the &quot;open source + collaborative development&quot; model, actively participating in the construction of cutting-edge global AI communities. 
Its collaboration methods primarily include:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>GitHub Open Source Collaboration<\/strong> (<a href=\"https:\/\/github.com\/OpenBMB\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Portal<\/a>): code contributions, issue resolution, and pull request feedback.<\/li>\n\n\n\n<li><strong>Community activities\/live courses<\/strong>: lectures, workshops, and online paper sharing.<\/li>\n\n\n\n<li><strong>Customized R&amp;D for enterprises\/organizations<\/strong>: tailor-made large-model training solutions for specific industries and commercial companies.<\/li>\n\n\n\n<li><strong>Academic\/research collaborative innovation<\/strong>: collaboration with top universities and research institutions at home and abroad to drive methodological breakthroughs.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1593\" height=\"914\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/08\/image-166.png\" alt=\"Official Guide\" class=\"wp-image-22645\"\/><figcaption class=\"wp-element-caption\">Image\/Official Guide<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Frequently Asked Questions<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">1. Is OpenBMB free for enterprises? In which scenarios is a license required?<\/h3>\n\n\n\n<p><strong>Answer:<\/strong> Most of OpenBMB&#039;s core tools and pre-trained models are free for commercial use (source must be acknowledged and licenses must be followed). 
However, for in-depth customization, large-scale enterprise integration, or professional operation and maintenance services, it is recommended to contact the official team in advance through the <a href=\"https:\/\/www.openbmb.org\/home\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >cooperation portal<\/a> to discuss authorization details.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">2. Which mainstream hardware and cloud service platforms does OpenBMB support?<\/h3>\n\n\n\n<p><strong>Answer:<\/strong> The OpenBMB toolchain is highly compatible with mainstream GPUs (such as the NVIDIA series), x86\/ARM servers, and major cloud computing platforms, including Alibaba Cloud and Tencent Cloud. The inference module BMInf can run efficiently on mid-range graphics cards (such as the GTX 1060) and supports hybrid cloud and local deployment.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">3. What are the differences and advantages of OpenBMB compared to mainstream international LLM platforms?<\/h3>\n\n\n\n<p><strong>Answer:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Domestically produced and controllable<\/strong>: OpenBMB is developed entirely through local academic and industry collaboration, providing compliant large AI models and training tools.<\/li>\n\n\n\n<li><strong>Active open source ecosystem<\/strong>: a large base of local Chinese contributors and deep localization adaptations, resulting in rapid community feedback.<\/li>\n\n\n\n<li><strong>Frontier technological innovation<\/strong>: several industry firsts in compression, fine-tuning, and prompt engineering.<\/li>\n\n\n\n<li><strong>Business\/public-welfare friendly<\/strong>: flexible free commercial licenses suitable for all types of entities to use immediately.<\/li>\n<\/ul>\n\n\n\n<p>For more technical comparisons, please see the <a 
href=\"https:\/\/www.openbmb.org\/docs\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >OpenBMB Technical Documentation Section<\/a>.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p>OpenBMB is setting an example in the AI open-source community in China and around the world, pushing the innovation and popularization of AI models to new heights. Whether for scientific research, education, innovation, or industrial intelligence, OpenBMB provides developers and users with an open, advanced, low-barrier large-model infrastructure, helping more enterprises and innovators embrace the intelligent era. <a href=\"https:\/\/www.openbmb.org\/home\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Visit the OpenBMB official website<\/a> to begin your large-model journey.<\/p>","protected":false},"author":3,"comment_status":"open","ping_status":"closed","template":"","meta":{"_crsspst_to_aicatswiki":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0},"content_visibility":[262],"sitetag":[17,822,802,857,821,687,820],"favorites":[577],"class_list":{"0":"post-20267","1":"sites","2":"type-sites","3":"status-publish","4":"hentry","5":"sitetag-ai","12":"favorites-ai-models"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sites\/20267","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sites"}],"about":[{"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/types\/sites"}],"author":[{"embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/comments?post=20267"}],"version-history":[{"count":2,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sites\/2026
7\/revisions"}],"predecessor-version":[{"id":22650,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sites\/20267\/revisions\/22650"}],"wp:attachment":[{"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/media?parent=20267"}],"wp:term":[{"taxonomy":"content_visibility","embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/content_visibility?post=20267"},{"taxonomy":"sitetag","embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sitetag?post=20267"},{"taxonomy":"favorites","embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/favorites?post=20267"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}