Model

Shusheng·Puyu Homepage, Documentation and Downloads - Large Multilingual Language Model - News Fast Delivery

Shusheng·Puyu (InternLM) is a multilingual large language model jointly developed by Shanghai Artificial Intelligence Laboratory and SenseTime (equal contribution), together with the Chinese University of Hong Kong, Fudan University, and Shanghai Jiao Tong University. InternLM is a multilingual base language model with 104B parameters, pre-trained with a multi-stage progressive process on a large corpus of 1.6T tokens, […]


Active Learning Algorithm Model Framework SALF

SALF (Simple Active Learning Framework) is an algorithm framework for active learning sampling and labeling of data, dedicated to providing a solid base framework and a sampling evaluation mode for the active learning field. The framework implements active learning interfaces for various deep learning tasks, including image classification, semantic segmentation, and image captioning. […]

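The excerpt stops before showing what an active-learning interface looks like in practice. As a rough illustration only, here is a minimal uncertainty-sampling loop in PyTorch; the function name and the (model, loader, budget) interface are assumptions for this sketch, not SALF's actual API.

```python
# Minimal uncertainty-sampling loop (illustrative only; not SALF's API).
import torch
import torch.nn.functional as F

@torch.no_grad()
def select_by_entropy(model, unlabeled_loader, budget):
    """Score each unlabeled sample by predictive entropy and return the dataset
    indices of the `budget` most uncertain samples, which an annotator would
    then label before the task model is retrained."""
    model.eval()
    scores, indices = [], []
    for x, idx in unlabeled_loader:          # loader yields (inputs, dataset indices)
        probs = F.softmax(model(x), dim=-1)  # (batch, num_classes)
        entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
        scores.append(entropy)
        indices.append(idx)
    scores, indices = torch.cat(scores), torch.cat(indices)
    top = scores.topk(min(budget, len(scores))).indices
    return indices[top].tolist()
```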

HuatuoGPT Homepage, Documentation and Downloads - Open Source Chinese Medical Model - News Fast Delivery

HuatuoGPT is an open-source Chinese medical large language model that trains on both real doctor replies and ChatGPT replies, so that the language model can act like a doctor and provide rich, accurate consultations. HuatuoGPT is committed to giving the language model doctor-like abilities to diagnose and provide useful information by fusing the “distilled […]


Causal Decoder Large Model Falcon-40B

Falcon-40B is a 40-billion-parameter causal decoder-only model trained on 1,000B tokens from RefinedWeb augmented with curated corpora. It tops Hugging Face's Open LLM Leaderboard, outperforming LLaMA, MPT, RedPajama, and StableLM, among others. Built with custom tooling, Falcon-40B uses a unique data pipeline that extracts training data from the public web. After Falcon grabs […]


Adversarial Active Learning Data Annotation Sampling Model SRAAL Based on State Relabeling

SRAAL is the open-source code of an adversarial active learning algorithm based on state relabeling. The algorithm targets active learning under a limited data-labeling budget: a VAE generative model performs unsupervised feature-reconstruction learning on the data through the variational process, and the state relabeling method is […]

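The excerpt is cut off before the selection step. In the state-relabeling / adversarial active-learning family, a common pattern is to encode each unlabeled sample with the trained VAE encoder and rank it with a labeled-vs-unlabeled state discriminator, sending the most "unlabeled-looking" samples for annotation. The sketch below illustrates that pattern under assumed interfaces (an encoder returning a latent mean, a discriminator returning a labeled-state probability); it is not the released SRAAL code.

```python
# Illustrative selection step in the adversarial active-learning style
# (assumed interfaces; not the released SRAAL code).
import torch

@torch.no_grad()
def select_for_annotation(vae_encoder, state_discriminator, unlabeled_loader, budget):
    """Encode unlabeled samples into the VAE latent space, score them with the
    labeled/unlabeled state discriminator, and return the `budget` samples the
    discriminator rates least 'labeled-like' for human annotation."""
    scores, indices = [], []
    for x, idx in unlabeled_loader:
        mu, _logvar = vae_encoder(x)                        # latent mean as representation
        labeled_prob = state_discriminator(mu).squeeze(-1)  # near 1 => looks labeled
        scores.append(labeled_prob)
        indices.append(idx)
    scores, indices = torch.cat(scores), torch.cat(indices)
    pick = scores.topk(min(budget, len(scores)), largest=False).indices
    return indices[pick].tolist()
```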

Chinese-English Bilingual Large Language Model CPM-Bee

CPM-Bee is a fully open-source, commercially usable Chinese-English base model with tens of billions of parameters. It adopts an auto-regressive Transformer architecture, is pre-trained on a trillion-token-scale high-quality corpus, and has strong foundational capabilities. The characteristics of CPM-Bee can be summarized as follows: Open source and commercially usable: OpenBMB has always adhered […]


Serge Homepage, Documentation and Downloads – Alpaca Model Chat Interface – News Fast Delivery

Serge is a chat interface based on llama.cpp for running Alpaca models. It is fully self-hosted, requires no API key, fits in 4 GB of RAM, and runs on CPU. The stack consists of a SvelteKit front end, Redis for storing chat history and parameters, and a FastAPI + LangChain API that wraps llama.cpp through its Python bindings. […]

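To make the stack above concrete, here is a minimal sketch of a FastAPI endpoint wrapping llama.cpp through the llama-cpp-python bindings. The model path, route, and generation parameters are assumptions for illustration, and Serge's Redis chat history and LangChain layers are omitted.

```python
# Minimal sketch of a FastAPI endpoint wrapping llama.cpp via llama-cpp-python.
# The model path and request/response shape are assumptions, not Serge's actual API.
from fastapi import FastAPI
from pydantic import BaseModel
from llama_cpp import Llama

app = FastAPI()
llm = Llama(model_path="./models/alpaca-7b-q4.bin")  # hypothetical local model file

class ChatRequest(BaseModel):
    prompt: str
    max_tokens: int = 256

@app.post("/chat")
def chat(req: ChatRequest):
    # Run the model on CPU and return the completion text.
    out = llm(req.prompt, max_tokens=req.max_tokens, stop=["### Instruction:"])
    return {"answer": out["choices"][0]["text"]}
```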

ChatFred Homepage, Documentation and Downloads – AI Model Workflow – News Fast Delivery

ChatFred is an Alfred workflow for chatting, image generation, and more, using ChatGPT, DALL·E 2, and other models. Setup: install it from the Alfred Gallery or download it via GitHub, then add your OpenAI API key. Usage: to start a conversation with ChatGPT, use the keyword cf, set the workflow as a fallback search in Alfred, or create a […]


DB-GPT Homepage, Documentation and Downloads - Database Large Language Model - News Fast Delivery

DB-GPT is an open-source experimental project that brings GPT to databases: it uses locally deployed GPT large models to interact with your data and environment, with no risk of data leakage, 100% private and 100% secure. DB-GPT builds a complete private large-model solution for all database-based scenarios. Because the solution supports local deployment, it can not only […]

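As a rough illustration of the "local model talks to local data" idea (not DB-GPT's actual implementation), the sketch below has a locally loaded model draft a SQL query and executes it against a local SQLite file, so nothing leaves the machine. The model path, prompt, and database file are placeholders.

```python
# Illustrative local-model-over-local-database loop; not DB-GPT's implementation.
import sqlite3
from transformers import pipeline

# Placeholder path: any locally downloaded text-generation model would do here.
generator = pipeline("text-generation", model="./local-llm")

def ask_database(question, db_path="app.db"):
    """Have the local model draft a single SQLite SELECT for the question,
    then run it locally, so the data never leaves the machine.
    (In practice the generated SQL should be validated as read-only first.)"""
    prompt = (
        "Translate the question into one SQLite SELECT statement.\n"
        f"Question: {question}\nSQL:"
    )
    text = generator(prompt, max_new_tokens=128)[0]["generated_text"]
    sql = text.split("SQL:")[-1].strip().split(";")[0] + ";"
    with sqlite3.connect(db_path) as conn:
        return conn.execute(sql).fetchall()
```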

CatAI Homepage, Documentation and Downloads - Alpaca Model Local Dialogue Client - News Fast Delivery

CatAI lets you run Alpaca models locally on your computer through a chat GUI. Installation and use: make sure Node.js is installed, then run npm install -g catai, catai install Vicuna-7B, and catai serve. Features: automatic programming-language detection, click on the user icon to display the original message, live text streaming, and fast model downloads.


LaWGPT Homepage, Documentation and Downloads – Large Language Model Based on Chinese Legal Knowledge – News Fast Delivery

LaWGPT is a series of open-source large language models based on Chinese legal knowledge. The series builds on general Chinese base models (such as Chinese-LLaMA and ChatGLM) and continues pre-training on a large-scale Chinese legal corpus, which enhances the models' basic semantic understanding in the legal field. On this basis, it constructs a […]
