Simplify Time-consuming and Overly Complicated Financial Processes.
Cloud Purchase Requisition, Purchase Order & Invoice Approval Software
Zahara's cloud-based platform automates budget management, supplier management, purchase requisitions, multi-level purchase approvals, deliveries, and invoice reconciliation and approval. Zahara integrates with most leading accounting software, such as QuickBooks Online and Xero, to give expanding SMEs real-time visibility and centralized control of their purchasing.
Zahara can be used to control spend across an organization. We take the initial request to buy something, automate the approval process, and send the PO to the vendor. Deliveries can be receipted, vendor invoices matched and processed, and the results exported to finance.
Zahara adds control yet speeds up processing.
Try it for FREE
CloudZero: The Cloud Cost Optimization Platform
CloudZero automates the collection, allocation, and analysis of your infrastructure and AI spend to uncover waste and improve unit economics.
CloudZero is the leader in proactive cloud cost efficiency. We enable engineers to build cost-efficient software without slowing down innovation. CloudZero's next-generation cloud cost optimization platform automates the collection, allocation, and analysis of cloud costs to uncover savings opportunities and improve unit economics. We are the only platform that enables companies to understand 100% of their operational cloud spend and take an engineering-led approach to optimizing that spend. CloudZero is used by industry leaders worldwide, such as Coinbase, Klaviyo, Miro, Nubank, and Rapid7.
Graphist uses PHP's GD library to produce data plots, in real time, served up as standard images for consumption by web pages (though such images could be saved for use in other document types).
The Genomic Diversity and Phenotype Data Model (GDPDM) captures molecular and phenotypic diversity data. MySQL databases are used to implement the schema. This project develops software tools (written in Java, Perl, etc.) associated with this model.
LIK (Locate Internet Knowledge) is dedicated to finding information on the Internet and showing, on a map, both where that information lives (the web server) and the path (the routers) between it and you (your personal computer).
These simple programs perform data reduction and display on columns of tab-delimited data. They are designed to be piped together so you can do several things at once, which means they typically read from stdin and write to stdout.
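The blurb does not name the individual programs, so as a sketch of the stdin-to-stdout filter pattern it describes, here is a minimal Python example (the `column_mean` helper is illustrative, not part of the project):

```python
import io

def column_mean(stream, col):
    """Mean of tab-delimited column `col` (0-based) read from a text stream."""
    total = count = 0
    for line in stream:
        fields = line.rstrip("\n").split("\t")
        total += float(fields[col])
        count += 1
    return total / count if count else 0.0

# In a real pipeline this would read sys.stdin, e.g.:
#   cat data.tsv | python colmean.py | another_filter
# Here we demonstrate with an in-memory stream instead.
demo = io.StringIO("alpha\t1.0\nbeta\t3.0\ngamma\t5.0\n")
print(column_mean(demo, 1))  # prints 3.0
```

Because each filter reads a stream and writes a stream, arbitrarily long pipelines can be composed without temporary files.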
Deliver and Track Online and Live Training Fast and Easy with Axis LMS!
Axis LMS is aimed at HR departments delivering employee or customer training.
Axis LMS enables you to deliver learning and training everywhere through a flexible and easy-to-use LMS that is designed to enhance your training, automate your workflows, and engage your learners.
Universal Information Crawler is a fast, precise, and reliable Internet crawler. Uicrawler is a program/automated script that browses the World Wide Web in a methodical, automated manner and builds an index of the documents it accesses.
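Uicrawler's own implementation is not shown in this listing; the following is a generic sketch of the methodical crawl-and-index loop it describes, run over a hypothetical in-memory "web" (a real crawler would fetch pages over HTTP and extract links from the HTML):

```python
from collections import deque

# Hypothetical in-memory "web": URL -> (document text, outgoing links).
PAGES = {
    "http://a.example/": ("start page", ["http://b.example/", "http://c.example/"]),
    "http://b.example/": ("about soccer", ["http://a.example/"]),
    "http://c.example/": ("about crawling", []),
}

def crawl(seed, pages):
    """Breadth-first crawl from `seed`, returning an index of visited documents."""
    index, queue, seen = {}, deque([seed]), {seed}
    while queue:
        url = queue.popleft()
        text, links = pages.get(url, ("", []))
        index[url] = text          # record the document under its URL
        for link in links:         # enqueue unseen links for later visits
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

index = crawl("http://a.example/", PAGES)
print(sorted(index))  # all three pages are reached and indexed
```

The `seen` set is what makes the traversal methodical: each URL is fetched at most once even when pages link back to each other.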
OpenBRR is an initiative to create an open, standard framework for the software assessment process. We strive to provide the industry-standard assessment framework for the open source world, along with a trusted data source to support it.
A machine learning toolkit for unsupervised and semi-supervised clustering that demonstrates excellent results on real-world data (see Bekkerman et al., ICML 2005 and ECML 2006).
geolocate is a front-end Java program that works with Google Maps to provide dynamic maps to users. Combined with the flexibility of XML and the power of JavaScript, users can view various relationships on their map and draw conclusions.
We make it cheaper and easier to manage your waitlist, order backlog, and just about any other waiting scenario.
Streamline your customer flow with our SMS-powered waitlist, reservations, and queue management app for restaurants, health care providers, and many other businesses.
A user-friendly open-source toolkit written in Java that lets you visualize and analyze the behaviour of users in the ActiveWorlds family of 3D virtual worlds by mapping them over 2D space.
A code for fast multi-dimensional density estimation. Instead of assuming an a priori metric, it calculates a locally adaptive metric for each data point using a Shannon-entropy-based binary space partitioning scheme.
Special Population Planner: a GIS-based emergency planning tool for all-hazards analysis. It is tailored to planning for persons with special needs, but can easily be adapted to other uses. It runs within ESRI ArcGIS 9.1 to 9.3.0.
FlexCRFs: A Flexible Conditional Random Fields Toolkit for Labeling and Segmenting Sequence Data (this includes a parallel implementation of CRFs called PCRFs to support training CRF models on massively parallel computer systems).
K-automaton is a new parsing (syntactic analysis) machine isomorphic to the language it parses. It is implemented in Java and can generate Java code from grammars described in EBNF.
A set of Ant filters that can be used to gather statistics from files or resources, mainly for log file analysis. The filters can count inputs, count occurrences of each input, and calculate the average, maximum, and minimum of float values in the input.
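The filters themselves are written for Ant; as a language-neutral sketch of the statistics they gather, the same summary could be computed like this in Python (the `summarize` helper is illustrative, not the project's API):

```python
from collections import Counter

def summarize(lines):
    """Gather the statistics the filters compute: input count, per-value
    occurrences, and average/max/min of float values in the input."""
    lines = list(lines)
    values = [float(x) for x in lines]
    return {
        "count": len(lines),
        "occurrences": Counter(lines),   # how often each distinct input appears
        "avg": sum(values) / len(values),
        "max": max(values),
        "min": min(values),
    }

stats = summarize(["1.5", "2.0", "1.5", "4.0"])
print(stats["count"], stats["avg"], stats["max"], stats["min"])  # 4 2.25 4.0 1.5
```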
The Internet Soccer Database aims to build a database structure that can hold all fixture/result/statistics/odds information for any soccer league or competition. Once the structure is defined, the database will be populated and the data made available for analysis.
baobab is an implementation of FP-Trees (Frequent Pattern Trees), a pattern recognition/data mining technique. It has many applications, including language processing and clickstream analysis.
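baobab's own API is not documented in this listing; as a sketch of the core FP-Tree construction step, frequent items are ordered by descending support and each transaction is inserted as a prefix path so that common prefixes are shared (all names here are illustrative):

```python
from collections import Counter

class Node:
    def __init__(self, item, parent):
        self.item, self.parent = item, parent
        self.count, self.children = 0, {}

def build_fp_tree(transactions, min_support):
    """Build an FP-Tree: keep items meeting `min_support`, sort each
    transaction by descending global support, and insert it as a path."""
    support = Counter(i for t in transactions for i in set(t))
    frequent = {i for i, c in support.items() if c >= min_support}
    root = Node(None, None)
    for t in transactions:
        items = sorted((i for i in set(t) if i in frequent),
                       key=lambda i: (-support[i], i))
        node = root
        for item in items:
            node = node.children.setdefault(item, Node(item, node))
            node.count += 1    # shared prefixes accumulate counts
    return root

tree = build_fp_tree([["a", "b"], ["a", "c"], ["a", "b", "c"]], min_support=2)
# "a" is the most frequent item, so all three paths share the "a" prefix.
print(tree.children["a"].count)  # 3
```

The count-ordered insertion is what keeps the tree compact: the most frequent items sit near the root, so transactions overlap as much as possible.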
The project aims to develop and maintain owlutils, a collection of Unix-style command-line utilities for processing ontologies stored in OWL files.
NOTE: use Fltk_Contour instead of this code! FMesh is a simple scientific data visualization tool based on OpenGL. FMesh can generate 2D and 3D contour and color-map graphs for any (x, y, z) scattered data set; many other options are included.
W.H.A.T. is an analytic tool for Wikipedia with two main functionalities: article networks and extensive statistics. It contains a visualization of the article networks and a powerful interface for analyzing the behavior of authors.
KML is a knowledge base with support for logical modeling. An advanced model is used to represent knowledge as a set of statements similar to natural-language sentences. This project hosts a model storage library and server (vrb-ols), along with clients.