ACTIFIO DATA DRIVEN SOLUTION
Data Readiness for Artificial Intelligence, Machine Learning, and Analytics.
Instant setup and teardown of centralized data for analytics via self-service, on-premises or in the cloud.
Data Scientists, Analysts, and Database Administrators (DBAs) create many copies of production data for a range of analysis use cases. And increasingly, these data environments must be spun up (and torn down) rapidly, regardless of data size.
Actifio’s software platform delivers a new level of data agility, managing data throughout its lifecycle, and providing instant access to virtual full data images, on-premises or in any cloud.
“If we get our development environment in two hours vs. two days, we could measure success in the cost of developer’s time. But the real benefit is so much more than just fast. There’s improved quality, speed to market, and getting a patch tomorrow rather than waiting ‘til Friday is another way to support customer satisfaction.”
Mark Moseley – Vice President of IT
Legacy solutions struggle to meet the needs of the evolving technology landscape.
DBAs typically use legacy backup tools to provision analytics sandbox/test environments. Due to the inherent inefficiencies in restore processes, particularly when accessed from tape, these environments are provisioned infrequently, and updated even less often.
To work around slow restores, DBAs often create physical copies of production databases. This increases storage costs, particularly for large databases, and still means long wait cycles, with significant time wasted creating and refreshing physical copies.
Additionally, in many environments, users can’t perform simultaneous testing due to under-provisioned infrastructure (compute and memory in addition to storage). And most of these point tools are not cloud-ready.
Working around these challenges forces DBAs to spend more time on mundane database cloning and management tasks, decreasing their efficiency and taking time away from more business-critical work.
HOW ACTIFIO HELPS
Actifio helps accelerate data preparation for analytics, model testing, and more by providing data users self-service access to instantly provision multi-TB data images.
The result is increased efficiency and significantly reduced strain on a DBA’s time and resources.
Also, because these images are rewritable thin clones, they consume no additional storage beyond the changes written to them, which helps reduce overall infrastructure costs.
Actifio’s ability to integrate via APIs with data transformation tools and provision dozens of database clones to test environments enables analysts and data scientists to rapidly test and tune their models.
And users can enjoy all these benefits while leveraging Actifio on-premises or in any public cloud.
FREQUENTLY ASKED QUESTIONS
How is Actifio priced?
Actifio’s pricing model is simple: it is based on the amount of source data managed. For example, if an enterprise wants to use Actifio to provision 20 clones of a 10 TB database, it would need an Actifio license for 10 TB of source data. Enterprises can create as many clones as they want, retain any amount of point-in-time history, and use the license anywhere, in the public cloud or on-premises.
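The licensing arithmetic above can be sketched in a few lines. This is an illustrative back-of-the-envelope comparison using the figures from the example (the variable names are ours, not Actifio's):

```python
# Illustrative comparison: source-based licensing vs. storage for full
# physical copies. Sizes in TB; figures follow the example in the text.
SOURCE_DB_TB = 10   # size of the production database
NUM_CLONES = 20     # virtual clones to provision

# The license is sized on source data managed, independent of clone count.
license_tb = SOURCE_DB_TB

# Full physical copies would each consume an entire copy of the database.
physical_copy_storage_tb = NUM_CLONES * SOURCE_DB_TB

print(f"Actifio license needed: {license_tb} TB of source data")
print(f"Storage for {NUM_CLONES} physical copies: {physical_copy_storage_tb} TB")
```

The same 10 TB license covers 20 clones, whereas physical copies would require 200 TB of storage.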
What form factors does Actifio come in?
Actifio VDP is the patented core Virtual Data Pipeline technology, and it can be consumed in multiple form factors.
Actifio Sky is a VM appliance that runs the Actifio VDP engine and can be provisioned in a public cloud VM or on-premises in VMware or Hyper-V VMs.
Actifio CDS/CDX is a physical appliance cluster that runs the Actifio VDP engine with high availability and is typically used on-premises.
Does Actifio work in the public cloud?
Yes. Actifio supports all major public cloud providers, including AWS, Azure, Google, IBM, and Oracle, and is also available in some public cloud marketplaces.
Does Actifio ingest data incrementally?
Yes. Actifio leverages native Block Change Tracking (BCT) technology to capture just the changed blocks in its incremental-forever data ingestion. After the first backup, which is an image copy, Actifio performs an incremental backup and incremental merge.
Thus, if 5% of a 10 TB production database changes, only 500 GB of data is ingested by Actifio. When testers need to refresh, they simply unmount their database clones and provision new clones in minutes.
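The incremental-forever arithmetic above is simple enough to state directly. A minimal sketch, using decimal units (1 TB = 1,000 GB) to match the 500 GB figure in the text:

```python
# Incremental-forever ingestion: after the first full image copy, only
# changed blocks are ingested. Decimal units (1 TB = 1,000 GB) match the
# text's example of 500 GB for a 5% change on 10 TB.
def ingested_gb(db_size_tb: float, change_rate: float) -> float:
    """Data ingested by one incremental backup, in GB."""
    return db_size_tb * 1000 * change_rate

print(ingested_gb(10, 0.05))  # 500.0 -> only changed blocks, not the full 10 TB
```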
Can Actifio capture transaction logs between incremental backups?
Yes. SLAs can be set so that transaction/archive logs are copied to the Actifio Sky instance every X minutes or hours between incremental data ingestions. For example, a user can set up incremental data ingestion every hour and log copies every 15 minutes.
How does point-in-time provisioning work?
While provisioning virtual database clones, a user can specify any point in time. Actifio automatically identifies the nearest incremental point-in-time image, instantly mounts a synthetic virtual full copy as of that point, and applies the transaction/archive logs to recover the database to the specified point in time. All of this is fully automated.
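The recovery planning described above can be sketched as follows. This is a minimal illustration of the logic (not Actifio's implementation): pick the nearest incremental snapshot at or before the requested point in time, then apply logs forward to reach it exactly.

```python
from datetime import datetime

def plan_recovery(snapshots, log_copies, target):
    """Return (base_snapshot, logs_to_apply) for a point-in-time recovery.

    snapshots and log_copies are sorted lists of datetimes;
    target is the requested point in time.
    """
    # Nearest incremental snapshot at or before the target...
    base = max(s for s in snapshots if s <= target)
    # ...then roll forward with every log copy between base and target.
    logs = [t for t in log_copies if base < t <= target]
    return base, logs

snaps = [datetime(2020, 1, 1, h) for h in (0, 1, 2, 3)]       # hourly ingests
logs = [datetime(2020, 1, 1, 2, m) for m in (15, 30, 45)]     # 15-min log copies
base, logs_needed = plan_recovery(snaps, logs, datetime(2020, 1, 1, 2, 40))
print(base, logs_needed)  # base = 02:00 snapshot; logs at 02:15 and 02:30
```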
What performance can users expect from provisioned database clones?
Actifio stores database backups in native format, so after an instant mount and clone provisioning there is no performance overhead from format conversion.
The other factor to consider is the storage on which Actifio keeps the database backups: depending on performance requirements, enterprises can specify the right storage tier to use with Actifio.
Lastly, Actifio offers the flexibility to instantly mount and provision database clones over Fibre Channel, iSCSI, or NFS, depending on user preference.
Thus, performance can be as good as the underlying storage and the protocol the user chooses.
Can Actifio coexist with existing backup tools such as Commvault or NetBackup?
Yes. This is possible with two approaches.
In the first approach, the enterprise uses Commvault/NetBackup to back up databases from the primary database instance while Actifio ingests data from the standby instance, or vice versa.
In the second approach, where Commvault/NetBackup and Actifio must both protect the same database instance, Actifio can still ingest data for Test Data Management in an incremental-forever manner using Block Change Tracking (BCT) without impacting the Commvault/NetBackup backups. However, only one product can manage the archive/transaction log backups.