Model

Vertical large language model TechGPT-7B

TechGPT is a vertical-domain large language model released by the “Knowledge Graph Research Group of Northeastern University”. The fully fine-tuned 7B version has been open-sourced. Demo: TechGPT-neukg. TechGPT mainly strengthens the following three types of tasks: various information extraction tasks centered on “knowledge graph construction”, such as relational-triple extraction, …
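The relational-triple extraction task mentioned above can be illustrated with a toy sketch: fixed string patterns stand in for the model's predictions, and `extract_triples` is a hypothetical helper showing the (subject, relation, object) output format, not TechGPT's interface.

```python
import re

def extract_triples(text):
    """Toy pattern-based extractor illustrating the (subject, relation, object)
    triple format used in knowledge-graph construction. A model like TechGPT
    predicts such triples; here fixed regexes stand in for the model."""
    patterns = [
        (r"(\w[\w ]*?) was founded by ([\w ]+)", "founded_by"),
        (r"(\w[\w ]*?) is located in ([\w ]+)", "located_in"),
    ]
    triples = []
    for pat, rel in patterns:
        for m in re.finditer(pat, text):
            triples.append((m.group(1).strip(), rel, m.group(2).strip()))
    return triples

print(extract_triples("Northeastern University is located in Shenyang"))
# [('Northeastern University', 'located_in', 'Shenyang')]
```

Each extracted triple maps directly onto a knowledge-graph edge: subject node, typed relation, object node.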


High-performance large-model inference library fastllm

fastllm is a full-platform LLM acceleration library implemented in pure C++. It supports Python calls; a ChatGLM-6B-class model can reach 10,000+ tokens/s on a single GPU; it supports GLM, LLaMA, and MOSS bases and runs smoothly on mobile phones. Feature overview: pure C++ implementation, easy to port across platforms; can be compiled directly on the Android ARM platform; supports the NEON instruction set; …
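For context, the quoted 10,000+ tokens/s is a decode-throughput figure. A minimal sketch of how such a number is measured; the dummy step below stands in for one model forward pass, and `measure_throughput` is an illustrative helper, not fastllm's API:

```python
import time

def measure_throughput(decode_step, n_tokens):
    """Time n_tokens calls to decode_step and return tokens per second.
    In a real benchmark, decode_step would run one autoregressive
    decoding step of the model."""
    start = time.perf_counter()
    for _ in range(n_tokens):
        decode_step()
    elapsed = time.perf_counter() - start
    return n_tokens / elapsed

# Dummy workload standing in for a model forward pass.
tps = measure_throughput(lambda: sum(range(100)), 1000)
print(f"{tps:.0f} tokens/s")
```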


ChatGLM2-6B Homepage, Documentation and Download – Open Source Bilingual Dialogue Language Model – News Fast Delivery

ChatGLM2-6B is the second-generation version of the open-source Chinese-English bilingual dialogue model ChatGLM-6B. While retaining many excellent features of the first-generation model, such as smooth dialogue and a low deployment threshold, ChatGLM2-6B introduces the following new features: more powerful performance: based on the development experience of the first-generation model, …


baichuan-7B Homepage, Documentation and Download – Open source Chinese and English large model – News Fast Delivery

baichuan-7B is an open-source large-scale pre-trained model based on the Transformer architecture, with 7 billion parameters trained on about 1.2 trillion tokens. It supports both Chinese and English, with a context window of 4,096 tokens. The overall model uses the standard Transformer structure and the same model design as LLaMA. Position encoding: rotary embedding. It …
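The rotary position embedding named above can be sketched in pure Python. This is an illustrative implementation of the general technique (rotating consecutive feature pairs by position-dependent angles), not baichuan-7B's actual code:

```python
import math

def rotary_embed(x, pos, base=10000.0):
    """Apply rotary position embedding to a vector x (even length) at
    position pos: each pair (x[2i], x[2i+1]) is rotated by the angle
    pos / base**(i/d), so absolute position is encoded as a rotation
    and relative position falls out of dot products between vectors."""
    d = len(x)
    out = []
    for i in range(0, d, 2):
        theta = pos / (base ** (i / d))
        c, s = math.cos(theta), math.sin(theta)
        out.extend([x[i] * c - x[i + 1] * s, x[i] * s + x[i + 1] * c])
    return out

v = [1.0, 0.0, 1.0, 0.0]
print(rotary_embed(v, pos=0))  # position 0 leaves the vector unchanged
```

Because each pair is a pure rotation, vector norms are preserved at every position, which keeps attention logits well scaled.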


Large language model text generation inference Text Generation Inference

A Rust, Python, and gRPC server for text generation inference. Used in production at HuggingFace to power the LLM API inference widget. Features: serve the most popular large language models with a simple launcher; Tensor Parallelism for faster inference on multiple GPUs; continuous batching of incoming requests to improve overall throughput; token streaming using Server-Sent Events (SSE); …
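Token streaming over Server-Sent Events can be sketched with a minimal stdlib parser. The `{"token": {"text": ...}}` event shape below is an assumed example payload, not a guaranteed schema:

```python
import json

def parse_sse(stream_lines):
    """Minimal Server-Sent Events parser: accumulates 'data:' payloads
    and yields one decoded JSON event per blank-line-terminated message,
    as a streaming client would while tokens arrive."""
    buf = []
    for line in stream_lines:
        if line.startswith("data:"):
            buf.append(line[5:].strip())
        elif line == "" and buf:
            yield json.loads("\n".join(buf))
            buf = []

# Hypothetical token events as a streaming endpoint might emit them.
raw = [
    'data: {"token": {"text": "Hello"}}',
    "",
    'data: {"token": {"text": " world"}}',
    "",
]
text = "".join(ev["token"]["text"] for ev in parse_sse(raw))
print(text)  # Hello world
```

Streaming lets a client render each token as it is generated instead of waiting for the full completion.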


M3E Homepage, Documentation and Download – Open Source Chinese Embedding Model New SOTA – News Fast Delivery

M3E is the abbreviation of Moka Massive Mixed Embedding. Moka: the model is trained, open-sourced, and evaluated by MokaAI; training uses the uniem script, and evaluation uses the MTEB-zh benchmark. Massive: the model is trained on a dataset of tens of millions (22M+) of Chinese sentence pairs. Mixed: the model supports bilingual homogeneous text …
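Sentence embeddings like those M3E produces are typically compared with cosine similarity. A stdlib sketch, with toy 3-d vectors standing in for real embedding outputs:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors: 1.0 means the
    same direction (semantically close), 0.0 means orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy vectors standing in for real sentence embeddings.
query = [0.9, 0.1, 0.2]
doc_similar = [0.8, 0.2, 0.1]
doc_unrelated = [0.0, 1.0, 0.0]
print(cosine_similarity(query, doc_similar) > cosine_similarity(query, doc_unrelated))  # True
```

Ranking documents by this score against a query embedding is the core of embedding-based semantic search.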


NeuMan Homepage, Documentation and Download – Single Video Content Reconstruction Model – News Fast Delivery

NeuMan is a machine learning model developed by Apple that uses neural radiance fields to reconstruct the background scene and an animated human character from a single video. Environment: create an environment with Conda: conda env create -f environment.yml. Alternatively, you can create the environment by executing: conda create -n neuman_env python=3.7 -y; conda activate neuman_env; conda install …


FHIRModels Homepage, Documentation and Downloads – FHIR Resource Data Model Swift Library – News Fast Delivery

FHIRModels is a Swift library for FHIR® resource data models. Features: native Swift representations of FHIR resources, elements, and data types; separate targets for DSTU2, STU3, R4, R4B, R5, and latest builds; mandatory non-nullability of mandatory parameters; enums for most closed code systems; enum support for value[x] types; date/time parsing, validation, and conversion to and …
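The date parsing and validation mentioned above can be illustrated with the stdlib (Python here for a runnable sketch; the library itself is Swift). FHIR's `date` type permits year, year-month, and full-date precision; `parse_fhir_date` is a hypothetical helper, not part of the library's API:

```python
from datetime import datetime

# FHIR `date` allows three precisions: YYYY, YYYY-MM, YYYY-MM-DD.
FORMATS = ["%Y", "%Y-%m", "%Y-%m-%d"]

def parse_fhir_date(value):
    """Parse a FHIR-style partial date, returning a datetime (missing
    parts default to 1) or raising ValueError for invalid input."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"not a FHIR date: {value!r}")

print(parse_fhir_date("2023-06"))  # 2023-06-01 00:00:00
```

Validating precision up front like this is what lets a typed model library reject malformed resource fields at decode time.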


Core ML Tools Homepage, Documentation and Downloads – Core ML Model Toolkit – News Fast Delivery

Google protobuf buffer error vulnerability: out-of-bounds memory write. Google protobuf is a data interchange format from Google. A buffer error vulnerability exists in Google protobuf; a remote attacker could exploit this vulnerability to execute code. Google Guava code issue vulnerability: resource allocation without limits or throttling. Google Guava is a Java core library of …


YuLan-Chat Homepage, Documentation and Downloads – Large Language Dialogue Model Fine-tuned Based on Mixed Chinese and English Instructions – News Fast Delivery

YuLan-Chat is a large language dialogue model fine-tuned on high-quality mixed Chinese-English instructions. It uses LLaMA as its base and is fine-tuned with carefully optimized, high-quality mixed Chinese-English instructions. Among the released models, YuLan-Chat-65B can significantly surpass existing open-source models on Chinese and English evaluation datasets. The team said that it …


Multilingual and multitask large language model TigerBot

TigerBot is a multilingual, multitask large language model (LLM). In automatic evaluation on the public NLP datasets used in OpenAI's InstructGPT paper, TigerBot-7B reaches 96% of the overall performance of the comparably sized OpenAI model. Currently open-sourced: models: TigerBot-7B, TigerBot-7B-base, TigerBot-180B (research version); code: basic training and inference code, …
