Benchmark of IBM InfoSphere Virtual Data Pipeline, IBM Cloud Object Storage and Actifio
Chandra Reddy: (00:04)
Hi, I’m Chandra Reddy, VP of product marketing at Actifio. I’m excited to have Brian from ESG and Anu from IBM here. Anu, can you tell us a little bit about this project, the ESG validation report for IBM VDP and IBM COS?
Anu Khera: (00:21)
Yes, absolutely, Chandra. We worked with Actifio as well as ESG, where ESG ran a benchmark with IBM COS and IBM VDP, the IBM InfoSphere Virtual Data Pipeline. As part of this effort, we took a real-life database, managed with Oracle RMAN and deployed in an IBM Cloud data center in Dallas. We took a point-in-time snapshot of it and had it compressed into Actifio VDP. That was the use case we had in the Dallas data center.
Chandra Reddy: (00:59)
Great. So you took a copy in one IBM Cloud data center, and then what did you do with the copy?
Anu Khera: (01:05)
So the intent was to see what we could do with that block-storage capture of the database held in IBM VDP. Basically, it was sent over a standard 10 Gbps pipe to the IBM Cloud data center in Washington, D.C. From there, VDP enabled five clones of the virtual images, which application developers could access to do their testing, and these virtual images take no additional capacity. So this, again, is the 17-terabyte gold copy residing in IBM COS, enabled by IBM VDP: no additional space is consumed, and the developers can do reads and writes directly on these virtual copies and run all of the performance querying and testing they need to develop their applications.
Chandra Reddy: (02:10)
Awesome. Great. So you took a 50-terabyte Oracle database, backed it up, replicated it to IBM COS, and from IBM COS you provisioned multiple database clones, it seems.
Anu Khera: (02:19)
Chandra Reddy: (02:20)
All right. And Brian, what were your findings?
Brian Garrett: (02:23)
We wanted to verify that this process was easy and policy-based, fast, and affordable, and it actually checked the box on each of those. From the “easy” standpoint, everything is very policy-based. The GUI (graphical user interface) is great, and there are RESTful APIs so you can use API magic. Using traditional methods, actually making full-volume copies, we estimated it would take up to 14 days. That shrinks down to a little over an hour to spin up the Oracle environment for the developers and get that point-in-time image ready to go, and the virtual copies themselves are almost instantaneous, driven by policy definitions. We were making copies every hour. So it’s very easy and policy-based.

Next, we wanted to see the performance. It’s not obvious: you take a mission-critical database that’s used to running on block storage, usually mission-critical, best-in-class storage, and put it on cloud object storage. The perception is that it’s not going to be fast. We were actually pleasantly surprised. We ran the exact same queries, both simple and complex, on 600 million rows: a simple query took about two minutes, and a complex query took about eight minutes on COS, which is plenty good enough, in our opinion, for functional test and development. It was actually faster on the cloud object storage.

Last but not least, we looked at the economics of the situation, and mostly due to the savings from COS storage, it was 79% less compared to legacy or traditional methods.
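The policy-based, API-driven provisioning Brian describes could be automated along these lines. This is a minimal sketch only: the function, field names, and image identifier below are hypothetical illustrations, not the documented Actifio/IBM VDP API.

```python
# Hypothetical sketch of requesting virtual clones of a point-in-time gold
# copy via a VDP-style REST API. Field names and values are illustrative
# assumptions, NOT the real Actifio/IBM VDP API schema.
import json

def build_clone_request(gold_copy_id: str, num_clones: int, point_in_time: str) -> str:
    """Build a JSON payload asking for virtual clones of a point-in-time
    image. Virtual clones are mounted, not copied, so they consume no
    additional storage capacity."""
    payload = {
        "source_image": gold_copy_id,       # the gold copy residing in IBM COS
        "clone_count": num_clones,          # e.g. five clones for developers
        "point_in_time": point_in_time,     # which hourly snapshot to mount
        "mode": "virtual",                  # virtual mounts, not full-volume copies
    }
    return json.dumps(payload)

# Example: five virtual clones of the Dallas gold copy, as in the benchmark
req = build_clone_request("oracle-gold-dallas", 5, "2019-06-01T00:00:00Z")
```

In practice such a payload would be POSTed to the product’s REST endpoint; the point here is only that clone provisioning reduces to a small, scriptable request rather than days of full-volume copying.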
Chandra Reddy: (03:51)
Awesome. So 79% savings. What is the biggest savings component in that?
Brian Garrett: (03:55)
It’s the cost of the storage where the copies are going to live, right? The old way of doing things, you would mirror mission-critical storage to mission-critical storage, and it needed to be the same, so that second copy usually cost you just as much. In this case, we were able to use the Virtual Data Pipeline technology, powered by Actifio, to transparently use object storage, which is much, much more affordable: you can buy it for pennies per gig per month. That’s what we did for this test, but it could also be deployed on-premises if you’d like; there’s a software-only version of it.
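The 79% figure reduces to simple arithmetic on the two storage bills. The dollar amounts below are hypothetical placeholders, not the actual rates from the ESG study; only the formula is the point.

```python
# Worked arithmetic behind a "79% less" claim: savings = 1 - new / old.
# The costs passed in below are illustrative placeholders, NOT the actual
# figures from the ESG validation report.
def savings_pct(traditional_cost: float, cos_cost: float) -> float:
    """Percentage saved by the COS-based approach versus the traditional one."""
    return round((1 - cos_cost / traditional_cost) * 100, 1)

# e.g. if the traditional mirrored block-storage approach cost 100 units and
# the COS gold copy plus virtual clones cost 21 units:
print(savings_pct(100.0, 21.0))  # -> 79.0
```

Because the five developer clones are virtual and consume no extra capacity, the COS side of the comparison is essentially one compressed gold copy, which is what drives the gap.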
Chandra Reddy: (04:28)
So to summarize, five virtual copies of a 50-terabyte Oracle database were provisioned straight out of IBM COS, and that’s where you saw the time shrink and the cost shrink.
Brian Garrett: (04:39)
Chandra Reddy: (04:40)
And the queries on 600 million rows ran in as little as two minutes.
Brian Garrett: (04:44)
Chandra Reddy: (04:44)
Straight out of the IBM COS.
Brian Garrett: (04:46)
Chandra Reddy: (04:46)
Awesome! Thank you for those results. They sound very exciting. Thank you, Anu. Thank you Brian. Thank you all.