--- title: AI accelerators description: Review comprehensive workflows, notebooks, and tutorials that help you find complete examples of common data science and machine learning workflows. --- # AI accelerators {: #ai-accelerators } AI Accelerators are designed to help speed up model experimentation, development, an...
index
--- title: Enrich data using the Hyperscaler API description: Call the GCP API and enrich a modeling dataset that predicts customer churn. --- # Enrich data using the Hyperscaler API {: #enrich-data-using-the-hyperscaler-api } [Access this AI accelerator on GitHub <span style="vertical-align: sub">:material-arrow-ri...
enrich-gcp
--- title: Select models using custom metrics description: This AI Accelerator demonstrates how to use DataRobot's Python client to extract predictions, compute custom metrics, and sort DataRobot models accordingly. --- # Select models using custom metrics {: #select-models-using-custom-metrics } [Ac...
ai-custom-metrics
--- title: Deploy Scoring Code as a microservice description: Follow a step-by-step procedure to embed Scoring Code in a microservice and prepare it as the Docker container for a deployment on customer infrastructure (it can be self- or hyperscaler-managed K8s). --- # Deploy Scoring Code as a microservice {: #deploy-...
sc-micro
--- title: End-to-end ML workflow with Snowflake description: Work with Snowflake and DataRobot's Python client to import data, build and evaluate models, and deploy a model into production to make new predictions. --- # End-to-end ML workflow with Snowflake {: #end-to-end-ml-workflow-with-snowflake } [Access this A...
ml-snowflake
--- title: Migrate a model to a new cluster description: Download a deployed model from DataRobot cluster X, upload it to DataRobot cluster Y, and then deploy and make requests from it. --- # Migrate a model to a new cluster {: #migrate-a-model-to-a-new-cluster} [Access this AI accelerator on GitHub <span style="ver...
model-migrate
--- title: End-to-end time series demand forecasting workflow description: Perform large-scale demand forecasting using DataRobot's Python package. --- # End-to-end time series demand forecasting workflow {: #end-to-end-time-series-demand-forecasting-workflow} [Access this AI accelerator on GitHub <span style="vertic...
demand-flow
--- title: End-to-end ML workflow with AWS description: Work with AWS and DataRobot's Python client to import data, build and evaluate models, and deploy a model into production to make new predictions. --- # End-to-end ML workflow with AWS {: #end-to-end-ml-workflow-with-aws} Being one of the largest cloud provider...
ml-aws
--- title: Mastering tables in production ML description: Review an AI accelerator that uses a repeatable framework for a production pipeline from multiple tables. --- # Mastering tables in production ML {: #mastering-tables-in-production-ml } [Access this AI accelerator on GitHub <span style="vertical-align: sub">:...
ml-tables
--- title: Feature Reduction with FIRE description: Learn about the benefits of Feature Importance Rank Ensembling (FIRE)&mdash;a method of advanced feature selection that uses a median rank aggregation of feature impacts across several models created during a run of Autopilot. --- # Feature Reduction with FIRE {: #fe...
fire
--- title: API Quickstart description: Learn how to begin using the DataRobot REST API to create projects and generate predictions. --- # API quickstart {: #api-quickstart } The DataRobot API provides a programmatic alternative to the web interface for creating and managing DataRobot projects. The API can be used vi...
index
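Since the quickstart above describes using the REST API programmatically, here is a minimal sketch of preparing an authenticated request. The endpoint URL and token are placeholders, and the `Bearer` header scheme and `/projects/` route name are assumptions for illustration, not taken from the page:

```python
# Minimal sketch of preparing an authenticated REST API request.
# The endpoint and token are placeholders; the Bearer header scheme
# and the /projects/ route name are assumptions for illustration.
API_ENDPOINT = "https://app.datarobot.com/api/v2"
API_TOKEN = "YOUR_API_TOKEN"

def build_headers(token):
    """Authorization header attached to every API request."""
    return {"Authorization": f"Bearer {token}"}

def projects_url(endpoint):
    """URL of the project-listing route (route name assumed)."""
    return f"{endpoint}/projects/"

# With an HTTP client of your choice, you would then issue:
#   GET projects_url(API_ENDPOINT) with headers=build_headers(API_TOKEN)
```

The same header-plus-URL pattern applies to every route; only the path changes per resource.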
--- title: API reference documentation description: Review the reference documentation available for DataRobot's programmatic tools. --- # API reference documentation {: #api-documentation-home } The table below outlines the reference documentation available for DataRobot's programmatic tools. <!--private start--> ...
index
--- title: Common use cases description: Review Jupyter notebooks that outline common use cases for version 3.x of DataRobot's Python client. --- # Common use cases {: #common-use-cases } Review Jupyter notebooks that outline common use cases and machine learning workflows using version 3.x of DataRobot's Python cli...
index
--- title: Make Visual AI predictions via the API dataset_name: N/A description: Learn how to make predictions on Visual AI projects with API calls. domain: DSX expiration_date: 06-01-2022 owner: nathan.goudreault@datarobot.com url: docs.datarobot.com/en/tutorials/using-the-api/vai-pred.html --- # Make Visual AI predi...
vai-pred
--- title: Make batch predictions with Azure Blob storage dataset_name: N/A description: Use the DataRobot Python Client package to set up a batch prediction job that reads an input file for scoring from Azure Blob storage and then writes the results back to Azure. domain: DSX expiration_date: 06-01-2022 owner: nathan....
azure-pred
--- title: Python code examples description: Review comprehensive workflows, notebooks, and tutorials that help you find complete examples of common data science and machine learning workflows. --- # Python code examples {: #python-code-examples } The API user guide includes overviews and workflows for DataRobot's P...
index
--- title: Make batch predictions with Google Cloud storage dataset_name: N/A description: Learn how to make and write predictions to Google Cloud Storage. domain: DSX expiration_date: 06-01-2022 owner: nathan.goudreault@datarobot.com url: docs.datarobot.com/en/tutorials/using-the-api/gcs-pred.html --- # Make batch pr...
gcs-pred
--- title: R client v2.29 reference documentation description: Learn about the new methods in version 2.29 of DataRobot's R client available for public preview. --- # R client v2.29 reference documentation The table below outlines the major classes for the methods introduced in v2.29 of the R client. Each endpoint ...
r-ref
--- title: Public preview R client v2.29 description: Learn about the new features available for public preview in version 2.29 of DataRobot's R client. --- # Public preview R client v2.29 Now available for public preview, DataRobot has released [version 2.29 of the R client](https://github.com/datarobot/rsdk/releas...
index
--- title: DataRobot apicore package walkthrough description: Learn how to access, install, and use DataRobot's apicore package for the R client. --- # DataRobot apicore package walkthrough {: #datarobot-apicore-package-walkthrough } In v2.29 of the R Client, available for public preview, DataRobot has introduced a ...
apicore
--- title: REST API code examples description: Review comprehensive workflows, notebooks, and tutorials that help you find complete examples of common data science and machine learning workflows. --- # REST API code examples {: #rest-api-code-examples } The API user guide includes overviews and workflows for DataRob...
index
--- title: R code examples description: Review comprehensive workflows, notebooks, and tutorials that help you find complete examples of common data science and machine learning workflows. --- # R code examples {: #r-code-examples } The API user guide includes overviews and workflows for DataRobot's R client that ou...
index
--- title: Price elasticity of demand modeling description: Understand the impact that changes in price will have on consumer demand for a given product. --- # Price elasticity of demand modeling {: #price-elasticity-of-demand-modeling } The price elasticity of demand is a measurement for how demand for a product is...
index
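The measurement described above (how demand responds to a price change) can be made concrete with a small computation. This sketch uses the midpoint (arc) formula, one standard way to estimate elasticity; the quantities and prices are purely illustrative:

```python
def price_elasticity(q0, q1, p0, p1):
    """Arc (midpoint) price elasticity of demand: the percent change
    in quantity demanded divided by the percent change in price,
    with each percent change taken relative to the midpoint."""
    pct_dq = (q1 - q0) / ((q1 + q0) / 2)
    pct_dp = (p1 - p0) / ((p1 + p0) / 2)
    return pct_dq / pct_dp

# Illustrative numbers: a price rise from 10 to 11 cuts demand
# from 100 units to 90 units.
e = price_elasticity(100, 90, 10.0, 11.0)  # about -1.105
```

Because |e| > 1 here, demand for this hypothetical product is elastic: the percentage drop in quantity exceeds the percentage rise in price.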
--- title: Python v2.x use cases description: Review Jupyter notebooks that outline common use cases for version 2.x of DataRobot's Python client. --- # Python v2.x use cases {: #python-v2x-use-cases } Review Jupyter notebooks that outline common use cases and machine learning workflows using DataRobot's Python clie...
index
--- title: No-show appointment forecasting description: Build a model that identifies patients most likely to miss appointments, with correlating reasons. Staff can then use this data to target outreach to those patients and to understand, and perhaps address, the associated issues. --- # No-show appo...
index
--- title: Predict steel plate defects description: Help manufacturers significantly improve the efficiency and effectiveness of identifying defects of all kinds, including those for steel sheets. --- # Predict steel plate defects {: #predict-steel-plate-defects } Steel plates, whether used for general construction,...
index
--- title: Predict the likelihood of a loan default description: AI models for predicting the likelihood of a loan default can be deployed within the review process to score and rank all new flagged cases. --- # Predict the likelihood of a loan default {: #predict-the-likelihood-of-a-loan-default } This page outline...
index
--- title: Anti-Money Laundering (AML) Alert Scoring description: Build a model that uses historical data, including customer and transactional information, to identify which alerts resulted in a Suspicious Activity Report (SAR). --- # Anti-Money Laundering (AML) Alert Scoring {: #anti-money-laundering-aml-alert-scor...
index
--- title: Predict late shipments description: This page outlines a use case to predict whether a shipment will be late or if there will be a shortage of parts. --- # Predict late shipments {: #predict-late-shipments } This page outlines a use case to predict whether a shipment will be late or if there will be a sho...
index
--- title: Reduce 30-day readmissions rate description: This page outlines a use case to reduce the 30-day readmission rate at a hospital. This use case is captured in a Jupyter notebook that you can download and execute. --- # Reduce 30-day readmissions rate {: #reduce-30-day-readmissions-rate } This page outlines ...
index
--- title: Triage insurance claims description: Evaluate the severity of an insurance claim in order to triage it effectively. --- # Triage insurance claims {: #triage-insurance-claims } This page outlines a use case that assesses claim complexity and severity as early as possible to optimize claim routing, ensure t...
index
--- title: Demand forecasting description: Learn about an end-to-end demand forecasting use case that uses DataRobot's Python package. --- # Large-scale demand forecasting {: #large-scale-demand-forecasting } The notebooks listed below outline how to perform large-scale demand forecasting using DataRobot's Python ...
index
--- title: Feature selection notebooks description: Review notebooks that outline feature selection. --- # Feature selection notebooks {: #feature-selection-notebooks } DataRobot offers end-to-end code examples via Jupyter notebooks that help you find complete examples of common data science and machine learning wor...
index
--- title: User AI Accelerators description: Review user-submitted workflows, notebooks, and tutorials that help you find complete examples of common data science and machine learning workflows. --- # User AI Accelerators {: #user-ai-accelerators } !!! warning The code in the GitHub repos for these accelerators is...
index
--- title: Troubleshooting Batch Prediction jobs description: A list of common issues that occur with Batch Prediction jobs, and how to resolve them. --- # Troubleshooting {: #troubleshooting } The following lists some common issues and how to resolve them. ## A job is stuck in `INITIALIZING` {: #a-job-is-stuck-in-...
batch-pred-tshoot
--- title: Output format description: Review the output formats for the predictions DataRobot returns in a columnar table format. --- # Output format {: #output-format } DataRobot returns predictions in a columnar table format. Each example value is followed by the data type it belongs to. The columns returned are d...
output-format
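Because predictions come back as a columnar table, downstream code can read them with an ordinary CSV reader. A minimal sketch; the column names below are hypothetical, chosen only to illustrate the columnar shape:

```python
import csv
import io

# Hypothetical columnar prediction output (column names are
# illustrative, not taken from the reference table on this page).
raw = """row_id,prediction,POSITIVE_CLASS_probability
0,1,0.87
1,0,0.12
"""

# Each row becomes a dict keyed by column name.
rows = list(csv.DictReader(io.StringIO(raw)))
predictions = [int(r["prediction"]) for r in rows]
probabilities = [float(r["POSITIVE_CLASS_probability"]) for r in rows]
```

Casting each column to its documented data type (int, float, string) as you read it avoids type surprises later in the pipeline.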
--- title: Prediction output options description: Learn how to program your batch prediction job's output. You can use local file streaming, S3 scoring, an AI Catalog dataset, JDBC, Snowflake, Synapse, or Tableau scoring. --- # Prediction output options {: #prediction-output-options } You can configure a prediction ...
output-options
--- title: Batch Prediction API description: The Batch Prediction API provides flexible options for scoring large datasets using the prediction servers you have already deployed. --- # Batch Prediction API {: #batch-prediction-api } The Batch Prediction API provides flexible options for intake and output when scorin...
index
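As a hedged sketch of what a Batch Prediction job request body can look like: the deployment ID below is hypothetical, and the `localFile` intake/output types follow the pattern described on the intake and output options pages; treat the exact field names as assumptions to verify against the reference:

```python
# Sketch of a Batch Prediction job request body.
# "abc123" is a hypothetical deployment ID; the field names are
# assumed from the intake/output options described in this section.
job_payload = {
    "deploymentId": "abc123",
    "intakeSettings": {"type": "localFile"},   # stream a local file in
    "outputSettings": {"type": "localFile"},   # stream scored rows back
}
```

Swapping the `type` values (for example to an S3 or JDBC type) is what redirects intake and output to other storage backends without changing the rest of the job.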
--- title: Prediction intake options description: For batch prediction data intake, you can use local file streaming, S3 scoring, Azure Blob Storage scoring, Google Cloud Storage scoring, HTTP scoring, an AI Catalog dataset scoring, JDBC scoring, Snowflake scoring, Synapse scoring, or BigQuery scoring. --- # Predicti...
intake-options
--- title: Batch prediction use cases description: Examine several end-to-end examples of scoring with API code for both CSV files and external services. --- # Batch prediction use cases {: #batch-prediction-use-cases } The following provides several end-to-end examples of scoring with API code for both CSV files an...
pred-examples
--- title: Schedule Batch Prediction jobs description: How to create a definition and schedule the execution of a Batch Prediction job. --- # Schedule Batch Prediction jobs {: #schedule-batch-prediction-jobs } After [creating a job definition](job-definitions.md), you can choose to execute job definitions on a sched...
job-scheduling
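A sketch of the cron-style schedule object such a job definition might carry; the field names are assumptions for illustration. This example would run the job daily at 09:00:

```python
# Hypothetical cron-style schedule object (field names assumed):
# run every day at 09:00, in any month, on any weekday.
schedule = {
    "minute": [0],
    "hour": [9],
    "dayOfMonth": ["*"],
    "month": ["*"],
    "dayOfWeek": ["*"],
}

# Attaching the schedule to a job definition and enabling it.
definition_update = {"enabled": True, "schedule": schedule}
```

Lists allow multiple values per field (for example `"hour": [9, 21]` for twice-daily runs), while `"*"` leaves a field unconstrained.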
--- title: Time series description: Outlines how to set up batch predictions for time series models. Includes settings details and code examples. --- # Time series {: #time-series } Batch predictions for time series models work without any additional configuration. However, in most cases you need to either modify th...
batch-pred-ts
--- title: Predictions on large datasets description: Walk through an example of making predictions on a large dataset using the Batch Prediction API. --- # Predictions on large datasets {: #predictions-on-large-datasets } [File size limits](pred-file-limits) vary depending on the prediction method&mdash;for predict...
large-preds-api
--- title: Batch Prediction job definitions description: How to submit a working Batch Prediction job. You must supply a variety of elements to the POST request payload depending on the type of prediction. --- # Batch Prediction job definitions {: #batch-prediction-job-definitions } To submit a working Batch Predict...
job-definitions
--- title: DataRobot REST API description: The DataRobot REST API provides a programmatic alternative to the web interface for creating and managing DataRobot projects. Use the DataRobot API to build highly accurate predictive models and deploy them into production environments. --- # DataRobot REST API {: #datarobot...
index
--- title: Get a prediction server ID dataset_name: N/A description: Learn how to retrieve a prediction server ID using cURL commands from the REST API or by using the DataRobot Python client. domain: DSX expiration_date: 6-20-2022 owner: nathan.goudreault@datarobot.com url: docs.datarobot.com/en/tutorials/using-the-ap...
pred-server-id
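A minimal sketch of the retrieval described above: build the listing URL, then pull the IDs out of the response. The `/predictionServers/` route name and the `data`/`id` response shape are assumptions for illustration:

```python
def prediction_servers_url(endpoint):
    """URL of the prediction-server listing route (route name assumed)."""
    return f"{endpoint.rstrip('/')}/predictionServers/"

def extract_server_ids(listing):
    """Pull server IDs from a listing response; a top-level 'data'
    array of objects with an 'id' field is assumed here."""
    return [server["id"] for server in listing.get("data", [])]

# Example with a mocked response body (IDs are hypothetical):
sample = {"data": [{"id": "server-1"}, {"id": "server-2"}]}
ids = extract_server_ids(sample)
```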
--- title: DataRobot Prediction API description: This section describes how to use DataRobot's Prediction API to make predictions on a dedicated prediction server. --- # DataRobot Prediction API {: #datarobot-prediction-api } This section describes how to use DataRobot's Prediction API to make predictions on a dedic...
dr-predapi
--- title: Make predictions with the API description: DataRobot's Prediction API provides a mechanism for using your model for real-time predictions on a prediction server. --- # Make predictions with the API {: #make-predictions-with-the-api } DataRobot's Prediction API provides a mechanism for using your model fo...
index
--- title: Deprecated API routes description: An overview of DataRobot's deprecated Prediction API routes, with a complete list of the specific deprecated POST and GET requests and their replacements. --- # Deprecated API routes {: #deprecated-api-routes } The Prediction API has changed significantly over time and ac...
deprecated-prediction-api
--- title: Time series predictions for deployments description: Make time series predictions for a deployed model. --- # Time series predictions for deployments {: #time-series-predictions-for-deployments } **Endpoint:** `/deployments/<deploymentId>/predictions` Makes time series predictions for a deployed model. ...
time-pred
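The endpoint stated above can be assembled as follows; the base URL and deployment ID are placeholders, not values from the page:

```python
def ts_predictions_url(base_url, deployment_id):
    """Build the documented route /deployments/<deploymentId>/predictions."""
    return f"{base_url.rstrip('/')}/deployments/{deployment_id}/predictions"

# "abc123" is a hypothetical deployment ID; the host is a placeholder.
url = ts_predictions_url("https://example.datarobot.com/predApi/v1.0", "abc123")
```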
--- title: Ping health check description: Health check to determine if the service is "alive". --- # Ping health check {: #ping-health-check } **Endpoint:** `/ping` Health check to determine if the service is "alive". **Request Method:** `GET` **Request URL:** deployed URL, example: `https://your-company.orm.data...
ping
--- title: Predictions for deployments description: Using a specified endpoint, calculates predictions based on user-provided data for a specific deployment. --- # Predictions for deployments {: #predictions-for-deployments } Using the endpoint below, you can provide the data necessary to calculate predictions for a...
dep-pred
--- title: Prediction Explanations for deployment description: Using a specified endpoint, makes predictions on a given deployment and provides explanations. --- # Prediction Explanations for deployments {: #prediction-explanations-for-deployments } **Endpoint:** `/deployments/<deploymentId>/predictions?maxExplanati...
dep-predex
--- title: Dedicated Prediction API reference description: This reference provides additional documentation for the Prediction API. It lists the methods, input and output parameters, and errors that the API may return. --- # Dedicated Prediction API reference {: #dedicated-prediction-api-reference } This reference pr...
index
--- title: Predictions for unstructured model deployments description: Using a specified endpoint, calculates predictions based on user-provided data for a specific unstructured model deployment. --- # Predictions for unstructured model deployments {: #predictions-for-unstructured-model-deployments } Using the endpo...
dep-pred-unstructured
--- title: Import data description: DataRobot lets you import data using multiple methods, including uploading dataset files locally, uploading from URLs, and connecting to data sources. --- # Import data {: #import-data } Data can be ingested into DataRobot from your local system, a URL, and through data connection...
index
--- title: Import to DataRobot directly description: This section provides detailed steps for importing without using the catalog, by drag and drop, URL, and HDFS. --- # Import to DataRobot directly {: #import-data-to-datarobot-directly } This section describes detailed steps for importing data to DataRobot. Before ...
import-to-dr
--- title: Exploratory Data Analysis (EDA) description: EDA is a two-stage process that DataRobot employs to first analyze datasets and summarize their main characteristics and then build models. --- # Exploratory Data Analysis (EDA) {: #exploratory-data-analysis-eda } Exploratory Data Analysis or _EDA_ is DataRobot'...
eda-explained
--- title: Data Quality Assessment description: The Data Quality Assessment automatically detects and often handles data quality issues such as outliers, leading or trailing zeros, target leakage, and many more. --- # Data Quality Assessment {: #data-quality-assessment } The Data Quality Assessment capability automa...
data-quality
--- title: Exploratory Spatial Data Analysis (ESDA) description: DataRobot Location AI provides tools to conduct ESDA. The tools let you interactively visualize and aggregate target, numeric, and categorical features on a map. --- # Exploratory Spatial Data Analysis (ESDA) {: #exploratory-spatial-data-analysis-esda }...
lai-esda
--- title: Feature Associations description: How to navigate and use the Feature Associations tab, which provides a matrix to help you track and visualize associations within your data. --- # Feature Associations {: #feature-associations } Accessed from the **Data** page, the **Feature Associations** tab provides a ...
feature-assoc
--- title: Analyze data description: Interpret the findings and visualizations created by DataRobot. --- # Analyze data {: #analyze-data } These sections help you interpret the findings and visualizations created by DataRobot. Topic | Describes... ----- | ------ [Data Quality Assessment](data-quality) | Understand ...
index
--- title: Over Time chart description: How to use the Over Time chart, which helps identify trends and potential gaps in your data, for all time-aware projects (OTV, single series, and multiseries). --- # Over Time chart {: #over-time-chart} The **Over time** chart helps you identify trends and potential gaps in y...
over-time
--- title: Feature details description: How to work with a feature on the Data page, to view its details and also (in some cases) modify its type. --- # Feature details {: #feature-details } The **Data** page displays tags to indicate a variety of information that DataRobot uncovered while computing EDA1. You can al...
histogram
--- title: Data connections description: To enable integration with a variety of enterprise databases, DataRobot provides a “self-service” JDBC platform for database connectivity setup. --- # DataRobot data connections {: #datarobot-data-connections } !!! note If your database is protected by a network policy that ...
data-conn
--- title: Stored data credentials description: How to add and manage securely stored credentials for reuse in accessing secure data sources. --- # Stored data credentials {: #stored-data-credentials } !!! info "Availability information" This feature is off by default for Self-Managed AI Platform users. To enab...
stored-creds
--- title: Share secure configurations description: Allows IT admins to configure OAuth-based authentication parameters for a data connection, and then securely share them with other users without exposing sensitive fields. --- # Share secure configurations {: #share-secure-configurations } IT admins can configure OA...
secure-config
--- title: Data connections description: Connect to various data sources and manage stored credentials. --- # Data connections {: #data-connections } The AI Catalog is a browsable and searchable collection of registered objects that contains definitions and relationships between various object types. These definitio...
index
--- title: Automatic transformations description: Learn about DataRobot's automatic transformations. Transformed features do not replace the original features, but are added as new features for building models. --- # Automatic transformations {: #automatic-transformations } The following sections describe DataRobot'...
auto-transform
--- title: Manual transformations description: Manually create feature transformations using the natural logarithm, squaring, running functions on numeric data, or changing the variable type, if appropriate. --- # Manual transformations {: #manual-transformations } The following sections describe manual, user-create...
feature-transforms
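The natural-logarithm and squaring transformations named above are simple to reproduce outside the platform, which is useful for sanity-checking transformed feature values. A minimal sketch:

```python
import math

def log_transform(x):
    """Natural-log transform; only valid for positive values."""
    return math.log(x)

def square_transform(x):
    """Square transform."""
    return x * x

# Applying both transforms to a small feature column.
values = [1.0, math.e, 3.0]
logged = [log_transform(v) for v in values]
squared = [square_transform(v) for v in values]
```

As in the platform, the transformed values would sit alongside the originals as new features rather than replacing them.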
--- title: Interaction-based transformations description: Without a secondary dataset, Feature Discovery does not run. However, you can configure settings to automatically create features using interactions in your primary dataset. --- # Interaction-based transformations {: #interaction-based-transformations } If your project h...
feature-disc
--- title: Transform data description: Perform transformations and feature discovery using DataRobot's feature engineering tools. --- # Transform data {: #transform-data } DataRobot supports multiple methods of feature engineering&mdash;automatic and manual feature transformations for single datasets, as well as Fea...
index
--- title: Prepare data with Spark SQL description: On the Add menu, Prepare data with Spark SQL lets you prepare a new dataset from a single dataset or blend several datasets using a Spark SQL query. --- # Prepare data in AI Catalog with Spark SQL {: #prepare-data-in-ai-catalog-with-spark-sql } Using **Prepare data...
spark
--- title: Schedule snapshots description: To keep a dataset in sync with the data source, you can schedule snapshots at specified intervals through the AI Catalog. --- # Schedule snapshots in the AI Catalog {: #schedule-snapshots-in-the-ai-catalog } !!! info "Availability information" The **AI Catalog** must b...
snapshot
--- title: Work with catalog assets description: How to find, share, and delete data assets in the AI Catalog, work with metadata and feature lists, view relationships and version history, download datasets. --- # Work with AI Catalog assets {: #work-with-ai-catalog-assets } Data assets within the AI Catalog can be ...
catalog-asset
--- title: AI Catalog description: DataRobot's AI Catalog is a collection of registered objects that contains definitions and relationships between various object types. --- # AI Catalog {: #ai-catalog } The AI Catalog is a browsable and searchable collection of registered objects that contains definitions and rela...
index
--- title: Load data and create projects description: From the AI Catalog, you can add external data using JDBC or a SQL query, create a snapshot, configure fast registration, upload calendars for time series projects, then create a project. --- # Import and create projects in the AI Catalog {: #import-and-create-pro...
catalog
--- title: Creating and managing data connectors description: Add data connectors to DataRobot and create a data connection from a connector. section_name: Data maturity: public-preview --- # Creating and managing data connectors {: #creating-and-managing-data-connectors } !!! info "Availability information" Crea...
connector
--- title: Create feature lists in the Relationship Editor description: Create custom feature lists and transform features in DataRobot's Relationship Editor. section_name: Data maturity: public-preview --- # Create feature lists in the Relationship Editor {: #create-feature-lists-in-the-relationship-editor } !!! inf...
safer-rel-editor-feature-lists
--- title: AI Catalog impact analysis description: View additional details that show how AI Catalog entities in the DataRobot application are related to--or dependent on--the current asset. section_name: Data maturity: public-preview --- # AI Catalog impact analysis {: #ai-catalog-impact-analysis } !!! info "Availabi...
catalog-impact
--- title: Data connection UI improvements description: Set up data connections in DataRobot using a more intuitive workflow. section_name: Data maturity: public-preview --- # Data connection UI improvements {: #data-connection-ui-improvements } !!! info "Availability information" The new data connection page is ...
new-data-ui
--- title: BigQuery connection enhancements description: A new BigQuery connector is now available for public preview, providing several performance and compatibility enhancements, as well as support for authentication using Service Account credentials. section_name: Data maturity: public-preview --- # BigQuery connec...
bigq-service-acct
--- title: Data public preview features description: Read preliminary documentation for data-related features currently in the DataRobot public preview pipeline. section_name: Data maturity: public-preview --- # Data public preview features {: #data-public-preview-features } {% include 'includes/pub-preview-notice-in...
index
--- title: Governance workflow support for Feature Discovery deployments description: DataRobot's Feature Discovery projects support the deployment approval workflow to control updates to secondary dataset configurations. section_name: Data maturity: public-preview --- # Governance workflow support for Feature Discove...
feat-disc-workflow
--- title: Snowflake key pair authentication description: Connect to Snowflake using key pair authentication. section_name: Data maturity: public-preview --- # Snowflake key pair authentication {: #snowflake-key-pair-authentication } !!! info "Availability information" Key pair authentication for Snowflake is off...
snow-keypair
--- title: Relationship Quality Assessment speed improvements description: Receive faster results from the Relationship Quality Assessment in Feature Discovery projects. section_name: Data maturity: public-preview --- # Relationship Quality Assessment speed improvements {: #relationship-quality-assessment-speed-improve...
rqa-speed
--- title: Fast EDA for large datasets description: Overview of Fast Exploratory Data Analysis (EDA) for large datasets, and how to apply early target selection. --- # Early target selection {: #early-target-selection } The data ingestion process for large datasets can, optionally, be different than that used for s...
fast-eda
--- title: Work with large datasets description: This section provides information on working with large datasets. Consider Fast EDA for large sets up to 10GB; use scalable ingest for sets up to 100GB. --- # Large datasets {: #large-datasets } The following sections provide additional information on working with lar...
index
# Microsoft SQL Server {: #microsoft-sql-server } ## Supported authentication {: #supported-authentication } - Username/password ## Prerequisites {: #prerequisites } The following is required before connecting to Microsoft SQL Server in DataRobot: - Microsoft SQL account ## Required parameters {: #required-parame...
dc-ms-sql-srvr
# Snowflake {: #snowflake } ## Supported authentication {: #supported-authentication } - [Username/password](#username-password) - [Snowflake OAuth](#snowflake-oauth) - [External OAuth](#snowflake-external-oauth) with Okta or AzureAD ## Username/password {: #username-password} ### Prerequisites {: #prerequisites } ...
dc-snowflake
# Microsoft Azure {: #microsoft-azure } ## Supported authentication {: #supported-authentication } - Azure SQL Server/Synapse username/password - Active Directory username/password ## Prerequisites {: #prerequisites } The following is required before connecting to Azure in DataRobot: - Azure SQL account ## Requir...
dc-azure
# Treasure Data {: #treasure-data } ## Supported authentication {: #supported-authentication } - Username/password ## Prerequisites {: #prerequisites } The following is required before connecting to Treasure Data (TD-Hive) in DataRobot: - Treasure Data account ## Required parameters {: #required-parameters } The...
dc-treasure
# PostgreSQL {: #postgresql } ## Supported authentication {: #supported-authentication } - Username/password ## Prerequisites {: #prerequisites } The following is required before connecting to PostgreSQL in DataRobot: - PostgreSQL account ## Required parameters {: #required-parameters } The table below lists the...
dc-postgresql