Text to 3D model Shap-E

Shap-E is a conditional generative model for 3D assets released by OpenAI. Unlike recent work on 3D generative models, Shap-E directly generates the parameters of implicit functions that can be rendered as textured meshes and neural radiance fields. The development team trains Shap-E in two stages: first, an encoder is trained that deterministically maps 3D assets […]
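The core idea of an "implicit function" representation can be sketched in a few lines. This is a toy illustration only, not Shap-E's actual network: Shap-E predicts the weights of a neural implicit function, whereas here the parameters are just a hand-picked center and radius.

```python
import math

# Toy illustration (not Shap-E's actual model): an implicit function maps a
# 3D coordinate to a value, and the shape is the set of points where that
# value is negative. Here the "parameters" are a center and a radius.
def sphere_sdf(x, y, z, cx=0.0, cy=0.0, cz=0.0, r=0.5):
    """Signed distance to a sphere: negative inside, positive outside."""
    return math.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) - r

# Query the implicit function on a coarse grid to get an occupancy volume,
# the kind of field a mesher (e.g. marching cubes) would turn into a mesh.
n = 16
coords = [-1 + 2 * i / (n - 1) for i in range(n)]
occupancy = [
    (x, y, z)
    for x in coords for y in coords for z in coords
    if sphere_sdf(x, y, z) < 0
]
print(f"{len(occupancy)} of {n**3} grid points lie inside the shape")
```

Swapping the hand-written sphere for a small neural network whose weights are produced by a generative model is, at a high level, what "generating parameters of implicit functions" means.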

PaLM 2 Homepage, Documentation and Downloads – Google’s Next Generation Large Language Model – News Fast Delivery

PaLM 2 is the next-generation large language model launched by Google. It excels at advanced reasoning tasks, including code and mathematics, classification and question answering, translation and multilinguality, and natural language generation. Google claims that PaLM 2 is a state-of-the-art language model that outperforms all of its previous LLMs, including PaLM. Currently, PaLM 2 is used in […]

WebCPM Homepage, Documentation and Downloads – Chinese AI Question Answering Model Supporting Networking – News Fast Delivery

WebCPM is the first open-source question-answering model framework for Chinese based on interactive web search. Its advantage is that its information retrieval is interactive: like a human, it can interact with a search engine to collect the factual knowledge needed to answer a question and then generate an answer. WebCPM search interface: […]

ImageBind Homepage, Documentation and Downloads – Multimodal AI Model – News Fast Delivery

ImageBind is an AI model that binds information from six different modalities (images, text, audio, depth, thermal, and IMU data) into a single embedding space, enabling machines to learn more holistically, directly from multiple kinds of information, without explicit supervision (i.e., the process of organizing and labeling raw data). ImageBind […]
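A shared embedding space can be sketched with a toy example. This is illustrative only, not the real ImageBind model: the vectors below stand in for the outputs of per-modality encoders, and cross-modal retrieval then reduces to nearest-neighbour search by cosine similarity.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors in the shared space."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Pretend outputs of an image encoder, text encoder, and audio encoder,
# all mapped into one 3-dimensional embedding space (hypothetical values).
embeddings = {
    "image:dog.jpg":  [0.9, 0.1, 0.0],
    "text:'a dog'":   [0.8, 0.2, 0.1],
    "audio:bark.wav": [0.7, 0.3, 0.0],
    "text:'a plane'": [0.0, 0.1, 0.9],
}

# Cross-modal retrieval: which non-image item is closest to the dog image?
query = embeddings["image:dog.jpg"]
ranked = sorted(
    (k for k in embeddings if not k.startswith("image:")),
    key=lambda k: cosine(query, embeddings[k]),
    reverse=True,
)
print(ranked[0])  # → text:'a dog'
```

Because all modalities land in the same space, the same similarity search works image-to-text, text-to-audio, and so on, which is the property the teaser describes.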

SuperCLUE Homepage, Documentation and Downloads – Chinese General Large Model Evaluation Benchmark – News Fast Delivery

SuperCLUE is an evaluation benchmark for general-purpose large models in Chinese. The main question it answers is: how well do Chinese large models perform amid the current boom in general-purpose large models? This includes, but is not limited to: how effective these models are on different tasks, to what extent […]

Chinese-LLaMA-Alpaca Homepage, Documentation and Downloads – Chinese LLaMA & Alpaca Large Model – News Fast Delivery

Chinese-LLaMA-Alpaca contains the Chinese LLaMA model and the Chinese Alpaca model fine-tuned on instructions. Building on the original LLaMA, these models expand the Chinese vocabulary and undergo secondary pre-training on Chinese data, further improving their understanding of basic Chinese semantics. In addition, the Chinese Alpaca model further uses Chinese […]

OpenLLaMA Homepage, Documentation and Downloads – An Open Source Replication of the LLaMA Large Language Model – News Fast Delivery

OpenLLaMA is an open-source reproduction of Meta AI’s LLaMA large language model, released under a permissive license. The repository contains a public preview of a 7B OpenLLaMA model trained on 200 billion tokens, and provides PyTorch and JAX weights for the pretrained model, as well as evaluation results and comparisons against the original LLaMA model.

MLC LLM Homepage, Documentation and Downloads – Local Large Language Model – News Fast Delivery

MLC LLM is a universal solution that allows any language model to be deployed locally on a variety of hardware backends and in native applications. In addition, MLC LLM provides an efficient framework that lets users further optimize model performance for their own needs. MLC LLM is designed to enable everyone to develop, optimize, and deploy AI […]

WizardLM Homepage, Documentation and Downloads – Fine-tuning Large Language Model Based on LLaMA – News Fast Delivery

WizardLM is a fine-tuned 7B LLaMA model, instruction-tuned in dialogue form on a large number of instructions of varying difficulty. The novelty of this model is its use of an LLM to automatically generate its training data. WizardLM uses a new method called Evol-Instruct (a new method that improves an LLM’s ability by […]
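The "automatically generate training data" idea can be sketched as follows. This is a minimal illustration of the evolve-an-instruction loop, under a strong simplifying assumption: the real Evol-Instruct pipeline asks an LLM to rewrite each instruction, whereas here a list of fixed rewrite rules stands in for that LLM call.

```python
import random

# Stand-ins for LLM-driven "evolution" operations that make an instruction
# harder or more constrained (hypothetical rules, not WizardLM's prompts).
EVOLUTIONS = [
    lambda s: s + " Explain your reasoning step by step.",
    lambda s: s + " Keep the answer under 100 words.",
    lambda s: s + " Include one concrete example.",
]

def evolve(instruction, rounds=2, rng=random):
    """Apply `rounds` random complications to a seed instruction."""
    for _ in range(rounds):
        instruction = rng.choice(EVOLUTIONS)(instruction)
    return instruction

# Each evolved instruction would then be answered (again by an LLM) and the
# resulting pairs used as fine-tuning data of graduated difficulty.
seeds = ["Write a poem about the sea.", "Sort a list in Python."]
training_data = [evolve(s) for s in seeds]
for item in training_data:
    print(item)
```

The point of the technique is that difficulty is controlled programmatically: more rounds of evolution yield harder instructions without any human annotation.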

SantaCoder Homepage, Documentation and Downloads – Lightweight AI Programming Model – News Fast Delivery

SantaCoder is a language model with 1.1 billion parameters that can be used for code generation and completion suggestions in several programming languages, such as Python, Java, and JavaScript. According to the official information, SantaCoder was trained on The Stack (v1.1) dataset. Although SantaCoder is relatively small, with only 1.1 […]

Enlightenment Homepage, Documentation and Downloads – Bilingual Multimodal Large Language Model – News Fast Delivery

“Enlightenment” is a bilingual multimodal pre-trained model with 1.75 trillion parameters. The project currently provides 7 open-source model results; the model parameter files require a download request on the Enlightenment platform. Graphics: CogView, with 4 billion parameters, can generate images from text, and after fine-tuning it […]

Pengcheng·Pangu α Homepage, Documentation and Downloads – Chinese Pre-trained Language Model – News Fast Delivery

Pengcheng Pangu α is the industry’s first pre-trained language model with 200 billion parameters centered on Chinese. Two versions are currently open source: Pengcheng Pangu α and the Pengcheng Pangu α enhanced edition, with support for both NPU and GPU. It supports rich application scenarios and performs outstandingly in text-generation fields such as knowledge […]
