{"id":19271,"date":"2025-07-26T12:02:45","date_gmt":"2025-07-26T04:02:45","guid":{"rendered":"https:\/\/aicats.wiki\/sites\/19271.html"},"modified":"2025-07-26T12:02:52","modified_gmt":"2025-07-26T04:02:52","slug":"bloom","status":"publish","type":"sites","link":"https:\/\/aicats.wiki\/en\/sites\/19271-html","title":{"rendered":"BLOOM"},"content":{"rendered":"<p><strong>BLOOM: Ushering in a New Era of Large-Scale Multilingual Open-Source AI Models<\/strong> BLOOM is a generative large-scale language model jointly developed by hundreds of AI researchers worldwide. It combines a massive parameter count, broad multilingual coverage, and openness, supporting 46 natural languages and 13 programming languages. The release of BLOOM symbolizes freedom and openness in AI research, and its powerful generative capabilities and broad applicability have attracted significant attention in the industry.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">BLOOM&#039;s main functions<\/h2>\n\n\n\n<p>BLOOM is an <strong>autoregressive generative large language model<\/strong> built on a transformer architecture, with 176 billion parameters and support for 46 natural languages and 13 programming languages. The model was developed and trained by the BigScience Workshop on the Jean Zay supercomputer in France, with the aim of promoting a transparent, reusable, and open AI research ecosystem. 
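<\/p>\n\n\n\n<p>&quot;Autoregressive&quot; means the model generates one token at a time, feeding each prediction back in as context for the next step. The toy sketch below illustrates only this decoding loop; the hand-made bigram table is purely hypothetical, standing in for BLOOM&#039;s 176-billion-parameter network:<\/p>

```python
# Toy illustration of autoregressive (greedy) decoding.
# NEXT_TOKEN is a hypothetical bigram lookup standing in for a real
# language model's next-token distribution.
NEXT_TOKEN = {
    "BLOOM": "is",
    "is": "an",
    "an": "open",
    "open": "multilingual",
    "multilingual": "model",
}

def generate(prompt_tokens, max_new_tokens=10):
    """Greedily extend the prompt one token at a time."""
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        nxt = NEXT_TOKEN.get(tokens[-1])  # "argmax" over next tokens
        if nxt is None:                   # no continuation known: stop, like EOS
            break
        tokens.append(nxt)
    return tokens

print(" ".join(generate(["BLOOM"])))  # BLOOM is an open multilingual model
```

<p>Real decoding samples from the model&#039;s probability distribution (with temperature, top-k\/top-p, etc.) rather than using a fixed lookup.<\/p>\n\n\n\n<p>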
Its advantages include:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Multilingual support<\/strong>: covers English, French, Chinese, Hindi, Arabic, and other languages.<\/li>\n\n\n\n<li><strong>Powerful generation capabilities<\/strong>: generates coherent, human-like text from user prompts.<\/li>\n\n\n\n<li><strong>Downstream task transfer<\/strong>: easy to fine-tune for NLP tasks such as summarization, question answering, translation, and information extraction.<\/li>\n\n\n\n<li><strong>Programming language coverage<\/strong>: performs well on mainstream programming languages such as Python, Java, and C++.<\/li>\n\n\n\n<li><strong>Fully open source and downloadable<\/strong>: anyone can download it free of charge from <a href=\"https:\/\/huggingface.co\/bigscience\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Hugging Face<\/a> and deploy it.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/07\/my_prefix_1753488616.png\" alt=\"Screenshot from BLOOM&#039;s official website\" class=\"wp-image-51824\"\/><figcaption class=\"wp-element-caption\">Photo\/<a href=\"https:\/\/huggingface.co\/bigscience\/bloom\" title=\"\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Screenshot from BLOOM&#039;s official website<\/a><\/figcaption><\/figure>\n\n\n\n<p>For example, BLOOM can readily perform the following functions (source: <a href=\"https:\/\/huggingface.co\/docs\/transformers\/model_doc\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >function page<\/a>):<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Function type<\/th><th>Description<\/th><\/tr><\/thead><tbody><tr><td>Text generation<\/td><td>Continuation, dialogue, short-essay creation<\/td><\/tr><tr><td>Summarization\/information extraction<\/td><td>Automatically generate text summaries and 
extract key information<\/td><\/tr><tr><td>Code completion<\/td><td>Code completion and generation for multiple programming languages<\/td><\/tr><tr><td>Semantic understanding<\/td><td>Handles reading comprehension and question answering in certain formats<\/td><\/tr><tr><td>Multilingual translation<\/td><td>Supports multilingual translation (not a substitute for professional machine translation, but useful for demos and experiments)<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">BLOOM&#039;s Data Diversity Statistics<\/h3>\n\n\n\n<p>BLOOM was trained on a highly diverse corpus, summarized in the following table:<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/07\/my_prefix_1753488621.png\" alt=\"BLOOM text generation example\" class=\"wp-image-51824\"\/><figcaption class=\"wp-element-caption\">Image\/BLOOM text generation example<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Item<\/th><th>Quantity<\/th><\/tr><\/thead><tbody><tr><td>Natural languages<\/td><td>46<\/td><\/tr><tr><td>Programming languages<\/td><td>13<\/td><\/tr><tr><td>Preprocessed text size<\/td><td>1.6 TB<\/td><\/tr><tr><td>Training tokens<\/td><td>350 billion (350B)<\/td><\/tr><tr><td>Maximum context length<\/td><td>2,048 tokens<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p>For more model details, please see the <a href=\"https:\/\/huggingface.co\/docs\/transformers\/model_doc\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >official Hugging Face documentation<\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img fetchpriority=\"high\" decoding=\"async\" width=\"1593\" height=\"894\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/07\/image-601.png\" alt=\"Official Q&amp;A page\" class=\"wp-image-19337\"\/><figcaption 
class=\"wp-element-caption\">Image\/Official Q&amp;A Page<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">BLOOM&#039;s pricing and plans<\/h2>\n\n\n\n<p><strong>As an open-source model, BLOOM is completely free: anyone can download and deploy it locally via Hugging Face without paying any licensing fees.<\/strong><\/p>\n\n\n\n<p>BLOOM is released under the BigScience RAIL license, which allows individuals, research institutions, and community groups to use and modify it free of charge, but explicitly prohibits use in unethical or illegal scenarios. For cloud-based inference, customized APIs, or enterprise-grade deployments, Hugging Face may offer separate paid options; these are value-added services from cloud and platform vendors and do not conflict with the open-source nature of the BLOOM model itself.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Version\/Service category<\/th><th>Price\/License<\/th><th>Access method<\/th><\/tr><\/thead><tbody><tr><td>BLOOM full model<\/td><td>Free, RAIL open-source license<\/td><td><a href=\"https:\/\/huggingface.co\/bigscience\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Official website<\/a><\/td><\/tr><tr><td>API cloud inference<\/td><td>Billed according to Hugging Face pricing<\/td><td><a href=\"https:\/\/huggingface.co\/inference-api\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >API page<\/a><\/td><\/tr><tr><td>Local deployment<\/td><td>Free<\/td><td>Hardware resources must be provided by the user.<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p>For more pricing and deployment details, please visit the <a href=\"https:\/\/huggingface.co\/pricing\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Hugging Face pricing page<\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" 
src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/07\/my_prefix_1753488634.png\" alt=\"Model training data diversity statistics chart\" class=\"wp-image-51824\"\/><figcaption class=\"wp-element-caption\">Figure \/ Statistical chart of diversity in model training data<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">How to use BLOOM<\/h2>\n\n\n\n<p>BLOOM is designed to be &quot;useful out of the box&quot; and supports multi-platform, multi-framework calls. Developers can use it in the following ways:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Download the weights and tokenizer directly<\/strong>: load and run them locally with PyTorch\/Transformers.<\/li>\n\n\n\n<li>Run cloud-based inference via the Hugging Face API (registration and an API key required).<\/li>\n\n\n\n<li>Fine-tune\/transfer-learn the model for specific business needs.<\/li>\n<\/ol>\n\n\n\n<p><strong>Quick-start example<\/strong> (see the <a href=\"https:\/\/huggingface.co\/docs\/transformers\/model_doc\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >official quick start guide<\/a>):<\/p>\n\n\n\n<!--wp-compress-html--><!--wp-compress-html no compression-->\n<pre class=\"wp-block-code\"><code>from transformers import AutoModelForCausalLM, AutoTokenizer\n\n# The full 176B checkpoint needs multi-GPU hardware; for local trials,\n# swap in a smaller checkpoint such as &quot;bigscience\/bloom-560m&quot;.\ntokenizer = AutoTokenizer.from_pretrained(&quot;bigscience\/bloom&quot;)\nmodel = AutoModelForCausalLM.from_pretrained(&quot;bigscience\/bloom&quot;)\n\nprompt = &quot;Please briefly introduce the main functions of the BLOOM model.&quot;\ninputs = tokenizer(prompt, return_tensors=&quot;pt&quot;)\noutputs = model.generate(**inputs, max_new_tokens=100)\nprint(tokenizer.decode(outputs[0], skip_special_tokens=True))<\/code><\/pre>\n<!--wp-compress-html no compression--><!--wp-compress-html-->\n\n\n\n<p>If you only need a small-scale trial, there are also 
interactive web demos available in Hugging Face Spaces.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1593\" height=\"894\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/07\/image-602.jpg\" alt=\"Paid plans\" class=\"wp-image-19338\"\/><figcaption class=\"wp-element-caption\">Image\/Paid plans<\/figcaption><\/figure>\n\n\n\n<p><strong>Hardware requirements:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>BLOOM parameter scale<\/th><th>Recommended hardware<\/th><\/tr><\/thead><tbody><tr><td>176B-parameter full version<\/td><td>Multiple A100 GPUs\/enterprise servers<\/td><\/tr><tr><td>Lightweight versions such as 7B\/3B\/1B<\/td><td>A single high-end GPU is sufficient<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Who is BLOOM suitable for?<\/h2>\n\n\n\n<p>BLOOM is positioned as &quot;open-source, cutting-edge technology&quot; and is therefore suitable for the following groups:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Academic researchers, university faculty, and students<\/strong>: for NLP research, reproducing results, model fine-tuning, and more.<\/li>\n\n\n\n<li><strong>AI developers and engineers<\/strong>: integrate it into product prototypes and verify AI capabilities.<\/li>\n\n\n\n<li><strong>Multilingual application developers<\/strong>: serve multinational corporations or linguistically diverse user groups.<\/li>\n\n\n\n<li><strong>Data scientists<\/strong>: for domain-specific text, knowledge extraction, and custom tasks.<\/li>\n\n\n\n<li><strong>Open-source community contributors<\/strong>: model optimization, evaluation, and development of supporting tools.<\/li>\n\n\n\n<li><strong>Programming-education and automation-tool developers<\/strong>: experiment with AI code generation\/completion 
functionality.<\/li>\n<\/ul>\n\n\n\n<p><strong>Reference table of application scenarios:<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Application type<\/th><th>Example value<\/th><\/tr><\/thead><tbody><tr><td>Multilingual document generation\/summarization<\/td><td>Automatic synthesis of multilingual information<\/td><\/tr><tr><td>Question answering and chatbots<\/td><td>Build assistants that support multiple languages<\/td><\/tr><tr><td>Code understanding and completion<\/td><td>Domain-specific programming assistance<\/td><\/tr><tr><td>Cross-language content creation<\/td><td>Automated content for global users<\/td><\/tr><tr><td>Low-resource language research<\/td><td>Supports the preservation of language diversity<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1593\" height=\"894\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/07\/image-602.png\" alt=\"Official documentation\" class=\"wp-image-19339\"\/><figcaption class=\"wp-element-caption\">Photo\/<a href=\"https:\/\/huggingface.co\/docs\/transformers\/model_doc\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Official documentation<\/a><\/figcaption><\/figure>\n\n\n\n<p>For detailed information on suitable users and operating suggestions, please refer to the <a href=\"https:\/\/huggingface.co\/docs\/transformers\/model_doc\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >official documentation<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Highlights of BLOOM&#039;s Technical Architecture and AI Training Model<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Model architecture features<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>It adopts a decoder-only structure, similar to GPT-3<\/strong>, but covers more languages and has better transfer and generalization 
capabilities.<\/li>\n\n\n\n<li><strong>It has 176 billion parameters and supports sequence lengths of up to 2,048 tokens<\/strong>, enabling a wider range of semantic understanding and generation.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Architecture parameter<\/th><th>Configuration\/Notes<\/th><\/tr><\/thead><tbody><tr><td>Number of layers<\/td><td>70<\/td><\/tr><tr><td>Number of attention heads<\/td><td>112<\/td><\/tr><tr><td>Hidden dimension<\/td><td>14336<\/td><\/tr><tr><td>Vocabulary size<\/td><td>250,680<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p>Reference: <a href=\"https:\/\/huggingface.co\/docs\/transformers\/model_doc\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >more technical details<\/a><\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Diversity and fairness of the AI training data<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Extensive data coverage<\/strong>: 46 natural languages, 13 programming languages, and 1.6 TB of high-quality text.<\/li>\n\n\n\n<li><strong>Diversity-by-design principle<\/strong>: emphasizes proportional sampling of low-resource languages and the values of &quot;open source, openness, and inclusivity&quot;.<\/li>\n\n\n\n<li><strong>Multiple model versions<\/strong>: besides the full 176B-parameter version, lightweight versions such as 7B1 and 3B are available for users with limited resources.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">BLOOM&#039;s Risks, Limitations, and Usage Recommendations<\/h2>\n\n\n\n<p><strong>Limitations and risks to keep in mind:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Not a tool for high-risk decision-making<\/strong>: model output can seem reliable while being inaccurate, so it is unsuitable for direct decision-making in scenarios such as biomedicine, 
finance, and law.<\/li>\n\n\n\n<li><strong>May output harmful content<\/strong>: e.g., biased, offensive, or sensitive wording.<\/li>\n\n\n\n<li><strong>Strict ethical and data compliance is required<\/strong>: comply with the RAIL license and do not abuse the model.<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Main risk type<\/th><th>Explanation<\/th><\/tr><\/thead><tbody><tr><td>Bias\/data imbalance<\/td><td>Some groups are represented at very different frequencies in the data<\/td><\/tr><tr><td>Personal information leakage<\/td><td>The training data may contain sensitive content<\/td><\/tr><tr><td>Generation of false information<\/td><td>Generated content is not guaranteed to be factual<\/td><\/tr><tr><td>Inappropriate use<\/td><td>Automated evaluation of individuals and critical decision-making scenarios are prohibited<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1593\" height=\"894\" src=\"https:\/\/aicats.wiki\/wp-content\/uploads\/2025\/07\/image-603.png\" alt=\"Risk and Limitations Statement Document\" class=\"wp-image-19342\"\/><figcaption class=\"wp-element-caption\">Photo\/<a href=\"https:\/\/huggingface.co\/bigscience\/bloom#risks-and-limitations\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >Risk and Limitations Statement Document<\/a><\/figcaption><\/figure>\n\n\n\n<p>For details, see the <a href=\"https:\/\/huggingface.co\/bigscience\/bloom#risks-and-limitations\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >risks and limitations statement<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">BLOOM Frequently Asked Questions<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">What are the different versions of the BLOOM model, and how do you choose between them?<\/h3>\n\n\n\n<p>BLOOM offers multiple parameter scales, ranging from micro (bloom-560m) to ultra-large 
scale (bloom-176B).<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>With limited hardware resources, choose a lightweight version such as 7B or 3B.<\/strong><\/li>\n\n\n\n<li><strong>For research and high-performance requirements, the full 176B version can be used, but it requires distributed multi-GPU deployment.<\/strong><\/li>\n<\/ul>\n\n\n\n<p><strong>For a detailed version list, please see<\/strong>: <a href=\"https:\/\/huggingface.co\/bigscience\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >BLOOM model list<\/a><\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Can BLOOM be used in commercial products?<\/h3>\n\n\n\n<p>Under the open-source RAIL license, BLOOM can generally be used in commercial applications (as long as it is not used illegally or in high-risk\/prohibited scenarios), but it is recommended to read the license carefully to ensure no additional terms are violated. Commercial calls to cloud APIs incur additional fees according to the Hugging Face platform&#039;s terms.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Can BLOOM be customized and fine-tuned with custom data?<\/h3>\n\n\n\n<p>BLOOM was designed as a <strong>transferable\/fine-tunable<\/strong> AI training model, and the development team and community have provided various practical solutions for fine-tuning. 
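<\/p>\n\n\n\n<p>The key data-preparation step in causal-LM fine-tuning is masking the prompt tokens out of the loss so the model is trained only on the completion. Below is a minimal sketch: the -100 ignore index follows the PyTorch\/Transformers convention, while the helper name and toy token IDs are illustrative, not part of BLOOM&#039;s API.<\/p>

```python
def build_example(prompt_ids, completion_ids, max_len=2048):
    """Build one training example for causal-LM fine-tuning.

    Prompt positions get label -100 (PyTorch's CrossEntropyLoss
    ignore index), so the loss covers only the completion tokens.
    max_len defaults to BLOOM's 2048-token context window.
    """
    input_ids = (prompt_ids + completion_ids)[:max_len]
    labels = ([-100] * len(prompt_ids) + completion_ids)[:max_len]
    attention_mask = [1] * len(input_ids)
    return {"input_ids": input_ids,
            "labels": labels,
            "attention_mask": attention_mask}

# Toy token IDs; a real pipeline would use the BLOOM tokenizer.
ex = build_example([101, 102, 103], [201, 202])
print(ex["labels"])  # [-100, -100, -100, 201, 202]
```

<p>Examples shaped this way can be fed to a Transformers Trainer with a causal-LM model.<\/p>\n\n\n\n<p>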
Based on the publicly available Transformers toolkit, developers can quickly adapt BLOOM to their own datasets for downstream tasks such as classification, annotation, and generation.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>Fine-tuning tutorials and practical guides: see the <a href=\"https:\/\/huggingface.co\/docs\/transformers\/model_doc\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >official documentation<\/a> and community write-ups.<\/p>\n<\/blockquote>\n\n\n\n<h2 class=\"wp-block-heading\">Conclusion<\/h2>\n\n\n\n<p>BLOOM has become a milestone in the democratization of NLP and open collaboration in AI. Its multilingual capabilities and open ecosystem have created unprecedented fertile ground for innovation for developers and AI enthusiasts worldwide. Whether for scientific experiments, language-diversity preservation, or intelligent product prototyping, BLOOM offers a flexible, professional, open, and high-performance paradigm for AI. 
If you are interested in experiencing the power of cutting-edge AI, visit the <a href=\"https:\/\/huggingface.co\/docs\/transformers\/model_doc\/bloom\" target=\"_blank\"  rel=\"nofollow noopener\"  class=\"external\" >BLOOM official documentation<\/a> now, embark on your exploration journey, and help advance AI technology.<\/p>","protected":false},"author":3,"comment_status":"open","ping_status":"closed","template":"","meta":{"_crsspst_to_aicatswiki":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0},"content_visibility":[262],"sitetag":[17,687,812,813,811],"favorites":[577],"class_list":{"0":"post-19271","1":"sites","2":"type-sites","3":"status-publish","4":"hentry","5":"sitetag-ai","10":"favorites-ai-models"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sites\/19271","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sites"}],"about":[{"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/types\/sites"}],"author":[{"embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/comments?post=19271"}],"version-history":[{"count":1,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sites\/19271\/revisions"}],"predecessor-version":[{"id":19345,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sites\/19271\/revisions\/19345"}],"wp:attachment":[{"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/media?parent=19271"}],"wp:term":[{"taxonomy":"content_visibility","embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/content_visibility?post=19271"},{"taxonomy":"sitetag","embeddable":true,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/sitetag?post=19271"},{"taxonomy":"favorites","embeddable":true
,"href":"https:\/\/aicats.wiki\/en\/wp-json\/wp\/v2\/favorites?post=19271"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}