Search Results for "/storage/emulated/0/android/data/net.sourceforge.uiq3.fx603p/files" - Page 5

Showing 106 open source projects for "/storage/emulated/0/android/data/net.sourceforge.uiq3.fx603p/files"

  • 1
    wav2vec2-large-xlsr-53-portuguese

    Portuguese ASR model fine-tuned on XLSR-53 for 16 kHz audio input

    ...The model performs well without a language model, though adding one can improve word error rate (WER) and character error rate (CER). It achieves a WER of 11.3% (or 9.01% with LM) on Common Voice test data, demonstrating high accuracy for a single-language ASR model. Inference can be done using HuggingSound or via a custom PyTorch script using Hugging Face Transformers and Librosa. Training scripts and evaluation methods are open source and available on GitHub. It is released under the Apache 2.0 license and intended for ASR tasks in Brazilian Portuguese.
    Downloads: 0 This Week
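    The custom-script inference path this entry mentions (Hugging Face Transformers plus Librosa, no language model) might look roughly like the sketch below. The checkpoint id and audio file name are placeholders, not taken from this listing:

    ```python
    import torch
    import librosa
    from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor


    def transcribe(audio_path: str, model_id: str) -> str:
        """Greedy (no language model) CTC transcription with a wav2vec2 checkpoint."""
        processor = Wav2Vec2Processor.from_pretrained(model_id)
        model = Wav2Vec2ForCTC.from_pretrained(model_id)
        model.eval()
        # The model expects 16 kHz mono audio; librosa resamples on load.
        speech, _ = librosa.load(audio_path, sr=16_000)
        inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
        with torch.no_grad():
            logits = model(inputs.input_values).logits
        # Greedy decode: take the most likely token at each frame, then let the
        # processor collapse repeats and remove CTC blanks.
        pred_ids = torch.argmax(logits, dim=-1)
        return processor.batch_decode(pred_ids)[0]
    ```

    The quoted WER improvement with a language model would come from replacing the argmax step with a beam-search decoder backed by an n-gram LM, rather than from anything in this greedy sketch.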
  • 2
    GigaChat 3 Ultra

    High-performance MoE model with MLA, MTP, and multilingual reasoning

    ...Its training corpus incorporates ten languages, enriched with books, academic sources, code datasets, mathematical tasks, and more than 5.5 trillion tokens of high-quality synthetic data. This combination significantly boosts reasoning, coding, and multilingual performance across modern benchmarks. Designed for high-performance deployment, GigaChat 3 Ultra supports major inference engines and offers optimized BF16 and FP8 execution paths for cluster-grade hardware.
    Downloads: 0 This Week
  • 3
    DeepSeek-V3.2

    High-efficiency reasoning and agentic intelligence model

    ...The model was notably used in competitive AI challenges such as the 2025 International Mathematical Olympiad (IMO) and IOI, achieving top-tier results. DeepSeek-V3.2 also features a large-scale agentic task synthesis pipeline, which generates training data to enhance tool-use intelligence and multi-step reasoning. It introduces a new “thinking with tools” chat template, allowing it to reason and decide when to invoke specific tools during problem solving.
    Downloads: 0 This Week
  • 4
    Llama-3.2-1B-Instruct

    Instruction-tuned 1.2B LLM for multilingual text generation by Meta

    ...It builds upon the Llama 3.1 architecture and incorporates fine-tuning techniques like SFT, DPO, and quantization-aware training for improved alignment, efficiency, and safety. The model supports eight primary languages (including English, Spanish, Hindi, and Thai) and was trained on a curated mix of publicly available online data, with a December 2023 knowledge cutoff. Llama-3.2-1B is lightweight enough for deployment on constrained devices like smartphones, using formats like SpinQuant and QLoRA to reduce model size and latency. Despite its small size, it performs competitively across benchmarks such as MMLU, ARC, and TLDR summarization. The model is distributed under the Llama 3.2 Community License, requiring attribution and adherence to Meta’s Acceptable Use Policy.
    Downloads: 0 This Week
  • 5
    Bio_ClinicalBERT

    ClinicalBERT model trained on MIMIC notes for clinical NLP tasks

    Bio_ClinicalBERT is a domain-specific language model tailored for clinical natural language processing (NLP), extending BioBERT with additional training on clinical notes. It was initialized from BioBERT-Base v1.0 and further pre-trained on all clinical notes from the MIMIC-III database (~880M words), which includes ICU patient records. The training focused on improving performance in tasks like named entity recognition and natural language inference within the healthcare domain. Notes were...
    Downloads: 0 This Week
  • 6
    Ministral 3 3B Base 2512

    Small 3B-base multimodal model ideal for custom AI on edge hardware

    Ministral 3 3B Base 2512 is the smallest model in the Ministral 3 family, offering a compact yet capable multimodal architecture suited for lightweight AI applications. It combines a 3.4B-parameter language model with a 0.4B vision encoder, enabling both text and image understanding in a tiny footprint. As the base pretrained model, it is not fine-tuned for instructions or reasoning, making it the ideal foundation for custom post-training, domain adaptation, or specialized downstream tasks....
    Downloads: 0 This Week