Data Strategy

The structured planning and execution of data initiatives aligned with organizational objectives, driving informed decision-making and maximizing data value.

Application Integration

Application integration is the process of enabling independently designed applications, systems, or software to work together. The goal is to create a seamless flow of information and functionality across different software applications, which might otherwise operate in isolation.

Data Democratization

Data democratization refers to the process of making data accessible to non-technical users within an organization without the intervention of IT specialists or data scientists. The intention is to empower all employees, regardless of their technical expertise, to use data in their decision-making processes.

Data Transformation

Data transformation is a fundamental process in data management and analysis that involves the conversion of data from one format or structure into another. This process is critical for integrating data from one or more sources, ensuring uniformity and consistency in datasets for analysis, reporting, and data-driven decision making.
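
As a toy illustration, transforming raw CSV-style rows into records with a uniform schema might look like this (the field names and normalization rules are invented for the example):

```python
# A minimal sketch of a data transformation step: converting raw
# CSV-style rows into uniform dictionaries with normalized types.
import csv
import io

raw = (
    "name,signup_date,revenue\n"
    "Ada Lovelace,2024-01-15,1250.50\n"
    "Grace Hopper,2024-02-03,980\n"
)

def transform(row):
    # Normalize each record: split the name and coerce revenue to a
    # float so downstream analysis sees a consistent schema.
    first, _, last = row["name"].strip().partition(" ")
    return {
        "first_name": first,
        "last_name": last,
        "signup_date": row["signup_date"],
        "revenue_usd": float(row["revenue"]),
    }

records = [transform(r) for r in csv.DictReader(io.StringIO(raw))]
print(records[0]["revenue_usd"])  # 1250.5
```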

Enterprise Data Management

Enterprise data management (EDM) is the practice of managing an organization's data to ensure it is accurate, accessible, and secure. It involves the processes, policies, and tools that are used to handle data across an organization.

Predictive Analytics

Predictive analytics is the process of using data to forecast future outcomes. It applies statistical algorithms, machine learning techniques, and other models to historical data in order to identify patterns and trends that can guide future actions. Predictive analytics answers the question, "What is likely to happen or not happen?"
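
As a minimal sketch of the idea, the example below fits a linear trend to made-up historical sales using ordinary least squares and projects one period ahead; real projects would use richer models and validate them.

```python
# A toy predictive-analytics example: fit a straight-line trend to
# invented historical monthly sales, then forecast the next month.
xs = [1, 2, 3, 4, 5]            # months
ys = [100, 120, 138, 160, 181]  # historical sales (invented)

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# Ordinary least squares: slope and intercept of the best-fit line.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

forecast_month_6 = slope * 6 + intercept
print(round(forecast_month_6, 1))
```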

Prescriptive Analytics

Prescriptive analytics is the process of using data to determine an appropriate course of action. It uses data analytics tools, including machine learning algorithms, to examine large data sets and recommend actions. This advanced form of data analysis answers the question, "What should we do?" It predicts future trends and makes suggestions on how to act on them by using optimization and simulation algorithms to recommend specific courses of action.

Serverless Architecture

Serverless architecture is a way of building and running applications and services without having to manage the underlying infrastructure typically associated with computing. In serverless architectures, the cloud provider automatically manages the allocation and provisioning of servers.

Data Management

Practices for organizing, storing, and maintaining data throughout its lifecycle to ensure accuracy, security, and accessibility for informed decision-making and compliance.

Change Data Capture

Change Data Capture (CDC) is a technique used to automatically identify and capture changes made to the data in a database. Instead of processing or transferring the entire database, CDC focuses only on the data that has been altered, such as new entries, updates, or deletions.
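
A naive way to see the idea is to diff two snapshots of a table and emit only the rows that changed; production CDC tools typically read the database's transaction log instead. The table contents below are invented:

```python
# A simplified sketch of change data capture by snapshot comparison:
# diff two snapshots of a table (keyed dicts) to emit only the
# inserts, updates, and deletes.
before = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
after = {1: {"name": "Ada L."}, 3: {"name": "Alan"}}

changes = []
for key, row in after.items():
    if key not in before:
        changes.append(("insert", key, row))
    elif before[key] != row:
        changes.append(("update", key, row))
for key in before:
    if key not in after:
        changes.append(("delete", key, None))

print(changes)
```

Only the three changed rows are captured, rather than the whole table.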

Cloud Data Access

Cloud data access refers to the ability to retrieve and manipulate data stored in cloud-based databases, storage systems, or applications.

Data Governance

Data governance is the system of rules, processes, and guidelines concerning how an organization manages its data. Data governance encompasses assigning the people who are responsible for the data, prescribing the rules around how it's processed, transported, and stored, and complying with company and government regulations to ensure data stays protected.

Data Mapping

Data mapping is a process of data management where a 'map' of the data is created to link fields from one database or dataset to those in another. A data map acts as a blueprint, illustrating how each piece of data from the source is associated with data in the target system.
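
In its simplest form, a data map can literally be a dictionary linking source field names to target field names; the fields below are hypothetical:

```python
# An illustrative data map: link source field names to target field
# names, then apply the map to translate a record's schema.
field_map = {
    "cust_nm": "customer_name",
    "addr_1": "street_address",
    "zip": "postal_code",
}

source_record = {"cust_nm": "Acme Corp", "addr_1": "1 Main St", "zip": "02134"}

# Rename each mapped field; unmapped fields are dropped.
target_record = {field_map[k]: v for k, v in source_record.items() if k in field_map}
print(target_record)
```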

Data Pipeline

A data pipeline is a set of processes and technologies for moving and processing data from one system to another. It typically involves extracting data from various sources, transforming it into a format suitable for analysis, and then loading it into a data storage system for business intelligence, analytics, or other applications.
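
A miniature extract-transform-load pipeline might look like the following sketch, with SQLite standing in for the destination store and the source rows invented for illustration:

```python
# A minimal extract-transform-load pipeline.
import sqlite3

# Extract: pull raw rows from a source (here, a hard-coded list).
raw_rows = ["widget,3", "gadget,5", "widget,2"]

# Transform: parse and normalize into (product, quantity) tuples.
rows = [(name, int(qty)) for name, qty in (r.split(",") for r in raw_rows)]

# Load: write into a table for analysis.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, quantity INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

total = conn.execute(
    "SELECT SUM(quantity) FROM sales WHERE product = 'widget'"
).fetchone()[0]
print(total)  # 5
```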

Data Warehouse

A data warehouse is a central repository of integrated data collected from multiple sources. It stores current and historical data in one single place that is typically used for analysis and reporting. The data stored in the warehouse is uploaded from systems such as marketing or sales. The data may pass through an operational data store and may require data cleansing to ensure data quality before it is used for reporting.

Database Management

Database management refers to the process of efficiently and effectively managing data within a database environment. It includes tasks like data storage, retrieval, updating, and security.

Document Processing

Document processing is the method of handling and organizing documents in both digital and physical formats. It involves various steps such as capturing, sorting, extracting information, and storing documents efficiently.

Hybrid Cloud

A hybrid cloud is a computing environment comprising a mix of on-premises, private cloud, and public cloud services that coordinate between the platforms. It's designed to give organizations greater control over their data and applications by creating a balance between the need for the scalability of public cloud services and the security of private cloud or on-premises infrastructure.

Data Movement

The technologies involved in transferring data from one location or system to another, ensuring efficiency, integrity, and security.

ADO

ADO (ActiveX Data Objects) is a Microsoft technology that provides a set of COM (Component Object Model) objects for accessing, editing, and updating data from a variety of sources through a single interface.

Automate SFTP File Transfer

SFTP (SSH File Transfer Protocol) is a protocol used to securely send data files from one software system to another. Files can be transferred manually over SFTP using tools like FileZilla or WinSCP, or transfers can be automated to ensure reliability and speed.

Cloud Migration

Cloud migration refers to the process of moving digital assets (data, applications, IT processes, or entire databases) from on-premises computers to the cloud or moving them from one cloud environment to another.

Data Extraction

Data extraction involves retrieving relevant information from various sources, which can range from databases and websites to documents and multimedia files.
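
As a small illustration, structured fields can be pulled out of free-form text with regular expressions; the text and patterns below are invented:

```python
# A small data extraction example: pull email addresses and order
# IDs out of free-form text with regular expressions.
import re

text = ("Order #A-1042 was confirmed for jane@example.com; "
        "#B-7 shipped to joe@example.org.")

emails = re.findall(r"[\w.+-]+@[\w-]+\.\w+", text)
order_ids = re.findall(r"#([A-Z]-\d+)", text)
print(emails, order_ids)
```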

Data Replication

Data replication is the process of copying data from one system to one or more others, such as a central database or data warehouse, and keeping the copies synchronized. It improves the availability and accessibility of data, ensuring that all users, regardless of their location, have access to consistent and updated information.

Data Transfer

Data transfer (also called data transmission) is the process of moving or copying data from one location, system, or device to another.

SFTP

SFTP (SSH File Transfer Protocol) is a secure protocol used to access, transfer, and manage files over a network.

SSIS

SSIS (SQL Server Integration Services) is a component of Microsoft SQL Server used for data integration, transformation, and migration tasks.

Data Connectivity

Capabilities involved with linking disparate data sources for seamless data exchange, facilitating integration, analysis, and decision-making across systems and platforms.

API Connector

An API connector is a software library, tool, or platform that facilitates programmatic access to the systems behind APIs. API connectors make it easier for IT teams, developers, and other data consumers to access the data those APIs expose.
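
A bare-bones sketch of what a connector hides from its callers, using a hypothetical CRM API (the base URL, endpoint, and token are placeholders):

```python
# A toy API connector: a wrapper that centralizes URL construction,
# authentication headers, and response parsing behind method calls.
import json
import urllib.parse
import urllib.request

class CrmConnector:
    """Hypothetical connector for a REST-style CRM API."""

    def __init__(self, base_url, token):
        self.base_url = base_url.rstrip("/")
        self.token = token

    def _build_request(self, resource, **params):
        # Callers never handle query strings or auth headers directly.
        query = urllib.parse.urlencode(params)
        url = f"{self.base_url}/{resource}"
        if query:
            url = f"{url}?{query}"
        return urllib.request.Request(
            url, headers={"Authorization": f"Bearer {self.token}"}
        )

    def list_contacts(self, limit=10):
        req = self._build_request("contacts", limit=limit)
        with urllib.request.urlopen(req) as resp:  # network call
            return json.load(resp)

conn = CrmConnector("https://api.example.com/v1", token="secret")
req = conn._build_request("contacts", limit=5)
print(req.full_url)  # https://api.example.com/v1/contacts?limit=5
```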

Cloud Connectivity

Cloud connectivity involves the use of the internet to link tools, applications, machines, and other technologies to cloud service providers (CSPs). These providers offer resources like computing power, storage, platforms, and application hosting.

Cloud Data Warehouse

A cloud data warehouse is a centralized data repository hosted on a cloud computing platform. Unlike traditional on-premises data warehouses, there is no upfront investment in hardware and infrastructure; instead, it leverages the cloud provider's resources. The key advantages of a cloud data warehouse include enhanced accessibility, reliability, and security. See: data warehouse

Cloud Managed File Transfer

Cloud managed file transfer (Cloud MFT) is a technology service that allows organizations to share files and data securely over the internet using cloud infrastructure. Unlike traditional managed file transfer (MFT), cloud MFT operates in a cloud environment, enabling organizations to manage file transfers without the need to invest in and maintain physical servers. See: Managed file transfer

Data Virtualization

Data virtualization is a technology that coordinates real-time or near real-time data from different sources into coherent, self-service data services. This process supports a range of business applications and workloads, enabling data to be accessed and connected in real time without the need for replication or movement.

JDBC

JDBC (Java Database Connectivity) is a Java API that enables Java programs to execute SQL statements and interact with databases.

Managed File Transfer

Managed file transfer (MFT) is a technology platform that enables organizations to share electronic information in a secure way across different systems or organizations. It goes beyond simple file transfer protocol (FTP), hypertext transfer protocol (HTTP), and secure file transfer protocol (SFTP) methods by incorporating encryption, standardized delivery mechanisms, and tracking to ensure the safety and integrity of the data.

ODBC

ODBC (Open Database Connectivity) is a standard API that allows applications to access data from various database management systems (DBMSs).

Data for B2B Integration

The processes and technologies facilitating seamless communication and collaboration between businesses, streamlining transactions and enhancing efficiency in supply chain operations.

AS2 EDI

AS2 (Applicability Statement 2) and EDI (electronic data interchange) are two technologies used together to transfer business documents and messages between the computer systems of separate companies. EDI is a universal document standard that has been around for many years and offers maximum interoperability between trading partners; AS2 is a secure protocol for transmitting those documents over the internet.

Different Types of EDI

EDI (electronic data interchange) is a collection of standards that specify how business documents can be understood by different software systems, even if they are not compatible with each other. The two most prominent types of EDI are X12 and EDIFACT.
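
As a rough illustration of the X12 format, segments are separated by "~" and elements within a segment by "*"; the fragment below is invented and heavily simplified:

```python
# A naive parser for an invented, simplified X12 fragment: split the
# document into segments ("~") and each segment into elements ("*").
x12 = "ST*850*0001~BEG*00*NE*PO12345**20240115~SE*3*0001~"

segments = [seg.split("*") for seg in x12.rstrip("~").split("~")]

# The first element of each segment identifies the segment type.
transaction_type = segments[0][1]  # "850" = purchase order
po_number = segments[1][3]
print(transaction_type, po_number)  # 850 PO12345
```

Real X12 interchanges also carry envelope segments (ISA/GS) and declare their own delimiters, which a production parser must honor.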

EDI 210

An EDI 210 is a type of X12 EDI document called a Motor Carrier Freight Details and Invoice. The document is sent by shipment carriers (e.g., FedEx, USPS) to companies that have requested the use of their trucks, planes, and ships to carry goods.

EDI 214

An EDI 214 is a type of X12 EDI document called a Transportation Carrier Shipment Status Message. The document is sent by shipment carriers (e.g., FedEx, USPS) to companies that have requested the use of their trucks, planes, and ships to carry goods.

EDI 240

An EDI 240 is a type of X12 EDI document called a Motor Carrier Package Status. It is exchanged between logistics providers and shipment carriers (e.g., FedEx, USPS) to provide updates on the status of shipped goods.

EDI 810

An EDI 810 is a type of X12 EDI document called an Invoice. It provides the same function as a paper or electronic invoice, including purchase details, item details, and the amount owed.

EDI 835

An EDI 835 document is a specific type of X12 EDI message called an electronic remittance advice (ERA). Healthcare insurance providers send EDI 835 documents to healthcare service providers, like hospitals, when the insurance provider has approved payment for specific claims submitted by the service provider.

EDI 846

An EDI 846 is a type of digital business document called the Inventory Inquiry/Advice. It standardizes the format of an electronic message that businesses use to communicate inventory levels, whether to inquire about the inventory status of a supplier or to advise a customer or partner about product availability.

EDI 850

An EDI 850 is a type of X12 EDI document called a Purchase Order. It provides the same function as a paper or electronic purchase order and contains the same information.

EDI 856

An EDI 856 is a type of X12 EDI document called an Advance Shipment Notice (ASN). An ASN indicates that ordered items are being prepared for shipment and includes details on expected delivery.

FTP Port

FTP uses two ports, each serving a specific role in the communication process: Port 21 carries commands and responses (the control channel), while Port 20 carries the file data itself in active mode (the data channel).

FTP Server

An FTP server is a server running specialized software that uses the File Transfer Protocol (FTP) to store and manage files. It acts as a digital storage hub, allowing users to upload and download files over a network or the internet.

Other Data Technologies

Other tools, platforms, and methodologies employed for data collection, storage, processing, analysis, and visualization to support organizational objectives and decision-making processes.

Business Rules

Business rules refer to the guidelines or principles that dictate how various aspects of a business should operate. They encompass the procedures, policies, and conditions that guide decision-making and actions within an organization.
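
One common implementation pattern is to express rules as data (a condition plus an action) so policies can change without rewriting program logic; the rules below are invented:

```python
# A toy business-rules evaluation: each rule pairs a condition with
# an action, and matching rules are applied in order.
rules = [
    {"name": "bulk discount",
     "applies": lambda o: o["quantity"] >= 100,
     "action": lambda o: {**o, "discount": 0.10}},
    {"name": "rush fee",
     "applies": lambda o: o["rush"],
     "action": lambda o: {**o, "fee": 25.0}},
]

def apply_rules(order):
    for rule in rules:
        if rule["applies"](order):
            order = rule["action"](order)
    return order

order = apply_rules({"quantity": 150, "rush": False})
print(order)
```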

Data Enrichment

Data enrichment is a process in which raw data is enhanced by adding information from additional sources, thereby increasing its value and utility. This involves taking basic data, which might be incomplete or insufficient for certain purposes, and supplementing it with relevant and complementary details.
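
A small sketch of the idea: supplementing basic customer records with fields looked up from a second, hypothetical reference source.

```python
# A minimal data enrichment example: merge lookup data from a
# second source into basic records, keyed by email address.
customers = [
    {"email": "jane@example.com"},
    {"email": "joe@example.org"},
]

# Reference data from another (invented) source.
firmographics = {
    "jane@example.com": {"company": "Acme Corp", "industry": "Manufacturing"},
}

# Records with no match are passed through unchanged.
enriched = [{**c, **firmographics.get(c["email"], {})} for c in customers]
print(enriched[0])
```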