Showing 126 open source projects

  • 1
    Google Workspace MCP Server

    Control Gmail, Google Calendar, Docs, Sheets, Slides, Chat, Forms

    ...The system is designed to operate as a backend service that integrates with AI applications such as coding agents, automation tools, and conversational assistants. Authentication is handled through OAuth-based flows that allow both single-user and multi-user environments while maintaining access control over Workspace data.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 2
    LLM Foundry

    LLM training code for MosaicML foundation models

    Introducing MPT-7B, the first entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k. Large language models (LLMs) are changing the world, but for those outside well-resourced industry labs, it can be extremely difficult to train and deploy...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 3
    Canopy

    Retrieval Augmented Generation (RAG) framework

    Canopy is an open-source retrieval-augmented generation (RAG) framework developed by Pinecone to simplify the process of building applications that combine large language models with external knowledge sources. The system provides a complete pipeline for transforming raw text data into searchable embeddings, storing them in a vector database, and retrieving relevant context for language model responses. It is designed to handle many of the complex components required for a RAG workflow, including document chunking, embedding generation, prompt construction, and chat history management. Developers can use Canopy to quickly build chat systems that answer questions using their own data instead of relying solely on the pretrained knowledge of the language model. ...
    Downloads: 0 This Week
    Last Update:
    See Project
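
    A minimal, framework-agnostic sketch of the RAG flow described above (chunk, embed, index, retrieve, build a prompt). The helper functions are illustrative placeholders, not Canopy's own API; Canopy wraps these steps behind its library and a Pinecone-backed vector store.

    # Toy RAG pipeline: chunk -> embed -> index -> retrieve -> prompt.
    import numpy as np

    def chunk(text: str, size: int = 200) -> list[str]:
        # Naive fixed-size chunking; real chunkers respect sentence bounds.
        return [text[i:i + size] for i in range(0, len(text), size)]

    def embed(texts: list[str]) -> np.ndarray:
        # Placeholder embedding: hashed bag-of-words, unit-normalized.
        vecs = np.zeros((len(texts), 256))
        for i, t in enumerate(texts):
            for tok in t.lower().split():
                vecs[i, hash(tok) % 256] += 1.0
        return vecs / np.maximum(np.linalg.norm(vecs, axis=1, keepdims=True), 1e-9)

    docs = ["Pinecone is a vector database.",
            "RAG augments prompts with retrieved context."]
    chunks = [c for d in docs for c in chunk(d)]
    index = embed(chunks)                      # the "vector store", as a dense matrix

    query = "What does RAG do?"
    scores = index @ embed([query])[0]         # cosine similarity (unit vectors)
    top = [chunks[i] for i in np.argsort(scores)[::-1][:2]]
    print("Answer using this context:\n" + "\n".join(top) + f"\n\nQ: {query}")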
  • 4
    LangChain Extract

    Did you say you like data?

    LangChain Extract is an open-source reference application designed to demonstrate how large language models can be used to extract structured data from unstructured text and document files. The project implements a lightweight web service that allows developers to define extraction schemas and apply them to various sources such as plain text, HTML, or PDF documents. Built using FastAPI and the LangChain framework, the application exposes a REST API that can process documents and return structured outputs that match user-defined JSON schemas. ...
    Downloads: 0 This Week
    Last Update:
    See Project
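
    A sketch of calling such a service over REST, assuming a locally running instance and a hypothetical /extract endpoint; the actual route and payload shape may differ, so consult the project's README.

    import requests

    schema = {  # user-defined JSON Schema describing the desired output
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "founded_year": {"type": "integer"},
        },
        "required": ["name"],
    }

    resp = requests.post(
        "http://localhost:8000/extract",   # assumed host and endpoint
        json={"text": "Acme Corp was founded in 1999 in Springfield.",
              "schema": schema},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())  # e.g. {"name": "Acme Corp", "founded_year": 1999}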
  • 5
    local-llm

    Run LLMs locally on Cloud Workstations

    ...The repository includes tools, Docker configurations, and command-line utilities that simplify the process of downloading, running, and interacting with language models directly on local or cloud-based workstations. This approach improves data privacy and control, as all inference can be performed locally without sending sensitive information to external APIs. It also integrates seamlessly with Google Cloud services, allowing developers to build and test AI-powered applications within the broader cloud ecosystem.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 6
    Streamline Analyst

    AI agent that streamlines the entire process of data analysis

    Streamline Analyst is a cutting-edge, open-source application powered by Large Language Models (LLMs) designed to revolutionize data analysis. This data analysis agent automates tasks such as data cleaning and preprocessing, as well as more complex operations like identifying target variables, partitioning test sets, and selecting the best-fit models for your data. With Streamline Analyst, results visualization and evaluation become seamless.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 7
    ToRA

    Tool-integrated Reasoning LLM Agents

    ...This approach allows the model to reason step by step in natural language and then execute precise calculations or code through tool calls, creating a hybrid reasoning workflow. The framework was designed to address known weaknesses of large language models in mathematical problem solving and formal reasoning tasks. Training data includes tool-use trajectories that teach the model when to reason verbally and when to delegate tasks to specialized tools.
    Downloads: 0 This Week
    Last Update:
    See Project
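
    A toy version of the tool-integrated loop sketched above: the harness extracts the model's code blocks, executes them, and appends the tool output to the transcript. call_model is a canned stand-in for an actual ToRA model; everything here is illustrative.

    import contextlib, io, re

    def call_model(transcript: str) -> str:
        # Stand-in: a real model would continue the reasoning itself. Here we
        # return a fixed step that delegates arithmetic to the Python tool.
        return "To find 17 * 23, I run:\n```python\nprint(17 * 23)\n```"

    def run_code(code: str) -> str:
        buf = io.StringIO()
        with contextlib.redirect_stdout(buf):
            exec(code, {})                 # NOTE: sandbox this in a real system
        return buf.getvalue().strip()

    transcript = "Q: What is 17 * 23?\n"
    step = call_model(transcript)
    transcript += step
    for code in re.findall(r"```python\n(.*?)```", step, re.S):
        transcript += f"\n[tool output] {run_code(code)}\n"
    print(transcript)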
  • 8
    Chinese-LLaMA-Alpaca 2

    Chinese LLaMA-2 & Alpaca-2 Large Model Phase II Project

    ...The Chinese LLaMA-2 base model and the instruction-fine-tuned Alpaca-2 model are open-sourced. These models expand and optimize the Chinese vocabulary on the basis of the original Llama-2 and use large-scale Chinese data for incremental pre-training, further improving fundamental Chinese semantic and instruction understanding. The models support FlashAttention-2 training and a 4K context, extensible to 18K+ via the NTK method.
    Downloads: 0 This Week
    Last Update:
    See Project
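
    The "NTK method" mentioned above usually refers to NTK-aware RoPE scaling: instead of interpolating positions, the rotary base is enlarged so that high-frequency dimensions barely change while low-frequency ones stretch to cover the longer context. A minimal sketch of the commonly cited formula follows; the project's exact implementation may differ.

    import numpy as np

    def ntk_scaled_inv_freq(dim: int, base: float = 10000.0, scale: float = 4.0):
        # Enlarge the base by scale**(dim / (dim - 2)), then compute the
        # standard RoPE inverse frequencies with the new base.
        new_base = base * scale ** (dim / (dim - 2))
        return 1.0 / new_base ** (np.arange(0, dim, 2) / dim)

    # scale=4.0 roughly targets 4x the 4K training context (~16K tokens).
    print(ntk_scaled_inv_freq(dim=128)[:4])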
  • 9
    autollm

    Ship RAG based LLM web apps in seconds

    ...The framework also includes built-in readers for multiple content sources such as PDFs, DOCX files, notebooks, websites, and other document types, which helps shorten the time between raw data and a working knowledge application.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 10
    towhee

    Framework that is dedicated to making neural data processing

    Towhee is an open-source machine-learning pipeline framework that helps you encode your unstructured data into embeddings. You can use our Python API to build a prototype of your pipeline and use Towhee to automatically optimize it for production-ready environments. From images to text to 3D molecular structures, Towhee supports data transformation for nearly 20 different unstructured data modalities. We provide end-to-end pipeline optimizations, covering everything from data decoding/encoding to model inference, making your pipeline execution 10x faster. ...
    Downloads: 0 This Week
    Last Update:
    See Project
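
    A small pipeline in the style of Towhee's Python API, adapted from the project's published examples; operator names and return types vary across versions, so treat this as a sketch rather than a verified snippet.

    from towhee import ops, pipe

    p = (
        pipe.input('path')
            .map('path', 'img', ops.image_decode())                         # decode
            .map('img', 'vec', ops.image_embedding.timm(model_name='resnet50'))
            .output('vec')
    )

    vec = p('cat.jpg').get()[0]   # run the pipeline on one image
    print(len(vec))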
  • 11
    Alpaca-CoT

    We unified the interfaces of instruction-tuning data

    ...The repository includes datasets, training scripts, and examples demonstrating how chain-of-thought data can be used to fine-tune language models. It also explores how reasoning traces generated by larger models can be distilled into smaller models.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 12
    LLaMA-MoE

    Building Mixture-of-Experts from LLaMA with Continual Pre-training

    LLaMA-MoE is an open-source project that builds mixture-of-experts language models from LLaMA through expert partitioning and continual pre-training. The repository is centered on making MoE research more accessible by offering smaller and more affordable models with only about 3.0 to 3.5 billion activated parameters, which helps reduce deployment and experimentation costs. Its architecture works by splitting LLaMA feed-forward networks into sparse experts and adding gating mechanisms so...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 13
    DB-GPT-Hub

    A repository that contains models, datasets, and fine-tuning

    ...The project serves as a specialized extension of the broader DB-GPT ecosystem, focusing on the preparation and evaluation of models capable of translating natural language questions into structured database queries. It offers a modular framework that supports data preparation, model fine-tuning, benchmarking, and inference for Text-to-SQL systems. The repository includes datasets and experiment configurations that allow researchers to train models on real database schemas and evaluate them using standardized benchmarks. Its design encourages experimentation with different large language models and fine-tuning techniques, including parameter-efficient training approaches.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 14
    RAGs

    Build ChatGPT over your data, all with natural language

    ...Users can also inspect and adjust parameters such as the number of retrieved documents, summarization strategies, and query settings through a configuration interface. Once the pipeline is created, the system enables conversational queries over the connected data sources, effectively creating a personalized knowledge assistant.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 15
    Autolabel

    Label, clean and enrich text datasets with LLMs

    Autolabel is a Python library to label, clean, and enrich datasets with Large Language Models (LLMs). Autolabel data for NLP tasks such as classification, question answering, named entity recognition, entity matching, and more. Seamlessly use commercial and open-source LLMs from providers such as OpenAI, Anthropic, HuggingFace, Google, and more.
    Downloads: 1 This Week
    Last Update:
    See Project
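
    A sketch of a labeling run following the pattern in Autolabel's documentation; the config keys and file name here are illustrative and may not match the current API exactly.

    from autolabel import AutolabelDataset, LabelingAgent

    config = {
        "task_name": "ReviewSentiment",
        "task_type": "classification",
        "dataset": {"label_column": "label", "delimiter": ","},
        "model": {"provider": "openai", "name": "gpt-3.5-turbo"},
        "prompt": {
            "task_guidelines": "Classify the sentiment of the movie review.",
            "labels": ["positive", "negative"],
            "example_template": "Review: {example}\nSentiment: {label}",
        },
    }

    agent = LabelingAgent(config)
    ds = AutolabelDataset("reviews.csv", config=config)  # hypothetical file
    agent.plan(ds)           # dry run: cost estimate and example prompts
    labeled = agent.run(ds)  # calls the LLM and returns the labeled dataset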
  • 16
    Chinese Llama 2 7B

    The first Chinese LLaMA2 model in the open source community

    Chinese Llama 2 7B is an open-source large language model adapted from the LLaMA-2 architecture and optimized for Chinese and bilingual Chinese-English applications. The project provides a version of LLaMA-2 that has been further trained on Chinese data so it can better understand and generate text in Chinese while maintaining compatibility with the original model ecosystem. In addition to the model weights, the repository also includes supervised fine-tuning datasets and training resources that help developers build chat-optimized versions of the model. The project follows the input format used by the LLaMA-2 chat architecture, ensuring compatibility with existing optimization techniques and tools built for the LLaMA-2 ecosystem. ...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 17
    LangChain Apps on Production with Jina

    Langchain Apps on Production with Jina & FastAPI

    ...You can benefit from the scalability and serverless architecture of the cloud without sacrificing the ease and convenience of local development. And if you prefer, you can also deploy your LangChain apps on your own infrastructure to ensure data privacy. With langchain-serve, you can craft REST/WebSocket APIs, spin up LLM-powered conversational Slack bots, or wrap your LangChain apps into FastAPI packages on the cloud or on-premises.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 18
    Chinese-LLaMA-Alpaca-2 v2.0

    Chinese LLaMA & Alpaca large language model + local CPU/GPU training

    This project has open-sourced the Chinese LLaMA model and the instruction-fine-tuned Alpaca large model to further promote open research on large models in the Chinese NLP community. Based on the original LLaMA, these models expand the Chinese vocabulary and use Chinese data for secondary pre-training, further improving basic Chinese semantic understanding. In addition, the Chinese Alpaca model is fine-tuned on Chinese instruction data, which significantly improves the model's ability to understand and execute instructions.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 19
    ThoughtSource

    A central, open resource for data and tools

    ThoughtSource is a central, open resource and community centered on data and tools for chain-of-thought reasoning in large language models (Wei 2022). Our long-term goal is to enable trustworthy and robust reasoning in advanced AI systems for driving scientific research and medical practice.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 20
    langchain-prefect

    Tools for using Langchain with Prefect

    ...We need to know details about how our apps work, even when we want to use tools with convenient abstractions that may obfuscate those details. Prefect is built to help data people build, run, and observe event-driven workflows wherever they want. It provides a framework for creating deployments on a whole slew of runtime environments (from Lambda to Kubernetes), and is cloud agnostic (best supports AWS, GCP, Azure). For this reason, it could be a great fit for observing apps that use LLMs. RecordLLMCalls is a ContextDecorator that can be used to track LLM calls made by Langchain LLMs as Prefect flows. ...
    Downloads: 0 This Week
    Last Update:
    See Project
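
    Usage in the shape the description gives: RecordLLMCalls is a ContextDecorator, so wrapping an LLM call in it records that call as a Prefect flow run. The import paths below follow the project's README-era layout and are an assumption.

    from langchain.llms import OpenAI
    from langchain_prefect.plugins import RecordLLMCalls

    llm = OpenAI(temperature=0)
    with RecordLLMCalls():
        # Each call made inside the context is tracked as a Prefect flow.
        llm("What would be a good name for a company that makes colorful socks?")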
  • 21
    VALL-E

    PyTorch implementation of VALL-E (Zero-Shot Text-To-Speech)

    ...Specifically, we train a neural codec language model (called VALL-E) using discrete codes derived from an off-the-shelf neural audio codec model, and regard TTS as a conditional language modeling task rather than continuous signal regression as in previous work. During the pre-training stage, we scale up the TTS training data to 60K hours of English speech, which is hundreds of times larger than existing systems. VALL-E exhibits in-context learning capabilities and can synthesize high-quality personalized speech from only a 3-second enrolled recording of an unseen speaker used as an acoustic prompt. Experiment results show that VALL-E significantly outperforms the state-of-the-art zero-shot TTS system in terms of speech naturalness and speaker similarity. ...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 22
    ChatGenTitle

    A paper title generation model fine-tuned on the LLaMA model

    ChatGenTitle: A paper title generation model fine-tuned on the LLaMA model using information from millions of arXiv papers.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 23
    Alpa

    Training and serving large-scale neural networks

    Alpa is a system for training and serving large-scale neural networks. Scaling neural networks to hundreds of billions of parameters has enabled dramatic breakthroughs such as GPT-3, but training and serving these large-scale neural networks require complicated distributed system techniques. Alpa aims to automate large-scale distributed training and serving with just a few lines of code.
    Downloads: 1 This Week
    Last Update:
    See Project
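
    The "few lines of code" amount to decorating a standard JAX train step. Below is a hedged sketch with a toy linear model; cluster setup (e.g. alpa.init) is elided, and the model and learning rate are hypothetical.

    import alpa
    import jax
    import jax.numpy as jnp

    @alpa.parallelize
    def train_step(params, batch):
        def loss_fn(p):
            pred = batch["x"] @ p["w"]                 # toy linear model
            return jnp.mean((pred - batch["y"]) ** 2)
        grads = jax.grad(loss_fn)(params)
        # Alpa chooses data/operator/pipeline parallelism for this step.
        return jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)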
  • 24
    Emb-GAM

    An interpretable and efficient predictor using pre-trained models

    Deep learning models have achieved impressive prediction performance but often sacrifice interpretability, a critical consideration in high-stakes domains such as healthcare or policymaking. In contrast, generalized additive models (GAMs) can maintain interpretability but often suffer from poor prediction performance due to their inability to effectively capture feature interactions. In this work, we aim to bridge this gap by using pre-trained neural language models to extract embeddings for...
    Downloads: 0 This Week
    Last Update:
    See Project
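
    A self-contained sketch of the idea: embed each n-gram with a fixed encoder, sum the embeddings per document, and fit a linear model on top, so every n-gram has an additive, inspectable contribution. The random embedder below is a placeholder for the pre-trained language model Emb-GAM actually uses.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def embed_ngram(ngram: str, dim: int = 64) -> np.ndarray:
        # Placeholder: pseudo-embedding keyed on the n-gram (stable per run).
        rng = np.random.default_rng(abs(hash(ngram)) % (2 ** 32))
        return rng.standard_normal(dim)

    def doc_features(doc: str) -> np.ndarray:
        grams = doc.lower().split()                    # unigrams, for brevity
        return np.sum([embed_ngram(g) for g in grams], axis=0)

    X = np.stack([doc_features(d) for d in ["good great film", "dull bad plot"]])
    y = np.array([1, 0])
    clf = LogisticRegression().fit(X, y)

    # Additivity: a single n-gram's contribution is just w . embed(ngram).
    print(clf.coef_[0] @ embed_ngram("great"))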
  • 25
    DomE

    Implements a reference architecture for creating information systems

    ...The architecture comprises elements that guarantee user access through automatically generated interfaces for various devices, integration with external information sources, data and operations security, automatic generation of analytical information, and automatic control of business processes. All these features are generated from the domain model, which is in turn continuously evolved through interactions with the user or autonomously by the system itself. This proposes an alternative to traditional software production processes, which involve several stages and different actors and often demand a great deal of time and money without delivering the expected result. ...
    Downloads: 0 This Week
    Last Update:
    See Project