Backup and Replication software has received strong mandates in organizations with enterprise mindsets and vision. But lower down the rung, small and medium organizations are less invested in backup and replication software. These organizations know full well that they must back up, replicate and protect their servers, physical and virtual, as well as new workloads in the clouds, given that the threat of security breaches and ransomware looms larger all the time. But many are put off by the cost of implementing and deploying Backup and Replication software.
So I explored one of the lesser known backup and recovery software packages, Nakivo® Backup and Replication (NBR), and took the opportunity to build a backup and replication appliance in my homelab with TrueNAS®. My objective was to create a cost-effective option for small and medium organizations to enjoy enterprise-grade protection and recovery without the hefty price tag.
This blog post, Part 1, covers the architecture overview of Nakivo® and the installation of the NBR software on TrueNAS® to create a backup and replication appliance. Part 2, in a future blog post, will cover the administration and operational usage of NBR.
Early in the year, I wrote about NAS systems being a high-impact target for ransomware. I called NAS a goldmine for ransomware. This is still very true because NAS systems are the workhorses of many organizations. They serve files and folders, and from them, the sharing and collaboration of work.
Another common function for NAS systems is being a target for backups. In small and medium organizations, backup software often directs its backups to a network drive on the network. Even for larger enterprise customers, NAS is a common destination for backups.
Typical NAS backup for small and medium organizations.
Backup to Data Domain with NAS (NFS, CIFS) Protocols
Ransomware obviously goes after the backups as another high-impact target, with the potential to disrupt the rescue and restoration of work files and folders.
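To make that exposure concrete, here is a minimal sketch of the kind of backup job described above, writing an archive to a NAS share. It assumes (my assumption, for illustration) that the share is already mounted at /mnt/nas-backup over NFS or SMB, and the paths are hypothetical. The key point is that the backup ends up as an ordinary file on a writable network path, which is exactly what ransomware can reach.

```python
# Minimal sketch: back up a directory tree to a mounted NAS share.
# Assumption: the NAS share (NFS or CIFS/SMB) is mounted at /mnt/nas-backup.
import tarfile
from datetime import datetime
from pathlib import Path

SOURCE = Path("/srv/fileshare")      # data to protect (hypothetical path)
TARGET = Path("/mnt/nas-backup")     # mounted NAS share (hypothetical mount)

def backup_to_nas() -> Path:
    """Tar and gzip the source tree, then write the archive to the NAS share."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = TARGET / f"fileshare-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SOURCE, arcname=SOURCE.name)
    return archive

if __name__ == "__main__":
    print(f"Backup written to {backup_to_nas()}")
```

Anything that can write to /mnt/nas-backup can also encrypt or delete these archives, which is why an immutable or off-site copy of the backups matters.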
[Preamble: I have been invited by GestaltIT as a delegate to their Tech Field Day for Storage Field Day 18 from Feb 27-Mar 1, 2019 in Silicon Valley, USA. My expenses, travel and accommodation were covered by GestaltIT, the organizer, and I was not obligated to blog or promote the technologies presented at this event. The content of this blog is my own opinions and views.]
The NetApp Data Fabric Vision
The NetApp Data Fabric vision has always been clear to me. Maybe it is because of my 2 stints with them, during which I got well soaked in their culture. 3 simple points define the vision:
The Data Fabric is THE data singularity. Data can be anywhere – on-premises, in the clouds, and more.
Have bridges, paths and workflow management for the data, to move the data to wherever it may be.
Work with technology partners to build tools and data systems to elevate the value of the data.
The NDAS feature is only available with ONTAP 9.5. With fewer than 5 clicks, data from ONTAP primary systems can be backed up to the secondary ONTAP target (running the NDAS proxy and the Copy to Cloud API), and then to AWS S3 buckets in the cloud.
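The eventual landing zone in that workflow is an ordinary S3 bucket. As a hedged illustration (this is not NetApp's tooling, just standard AWS boto3 calls with hypothetical names and region), preparing such a bucket as a backup target might look like this:

```python
# Illustrative sketch: create and version an S3 bucket to serve as the
# cloud target of a backup workflow such as NDAS Copy to Cloud.
# Assumptions: hypothetical bucket name and region; AWS credentials are
# configured the usual way (environment, ~/.aws/credentials, or IAM role).
import boto3

s3 = boto3.client("s3", region_name="ap-southeast-1")

bucket = "ontap-ndas-backup-target"   # hypothetical name

# Create the bucket in the chosen region.
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={"LocationConstraint": "ap-southeast-1"},
)

# Turn on versioning so overwritten or deleted backup objects are retained.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)
```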
Admit it! You are a terabyte junkie! I am sure many of us have one terabyte or more of our personal “stuff” at home. Heck, I even heard from a friend that he has almost 20TB of high definition movies (thank you, Torrent!) at home! That’s crazy!
And what does the typical Malaysian consumer do after he or she runs out of hard disk space? In KL (our beloved capital city, Kuala Lumpur), they throng the Low Yat IT mall or extensions of it, like Digital Mall in PJ Section 14. In other towns and cities in Malaysia, PC fairs are popular, as consumers try to get the best price possible. (We Malaysians are good at squeezing the max out of a deal.)
It is difficult for the not-so-IT-literate consumer to differentiate which brand is the best. Buffalo, Iomega, DLink, Western Digital, etc, etc. But the tides are changing, because these vendors want to tie you down for the rest of your digital life. You see, buying a small NAS for the home now comes with a big carrot, an incentive to keep you wanting more, and yet you can’t unbind yourself from the tether once you are hooked.
Cloud storage didn’t take off in a big way last year. But many cloud storage vendors know there are plenty of opportunities out there, so how do they get consumers to upload their files, photos and whatever stuff they might have to cloud storage? Ingeniously, they work together with smaller NAS storage players and use these vendors’ product offerings as bait. They bundle a significantly large FREE capacity or data protection offering in the Cloud Storage as the carrot, and once the consumer decides to put their files in the cloud storage, boom, they are ensnared to become a long-term ATM machine for the Cloud Storage Provider.
Sneaky? No? I call this good, smart marketing. You have a market of opportunities out there, but cloud storage isn’t catching on. You have small NAS vendors that are reaching out to the consumer market, but it’s a brutal, competitive arena and margins are razor thin. It’s a win-win situation for both sides.
This was moving towards a market that scratches the itch. Consumers wanted reliable backup too, but consumer-grade disk drives fail all too often, laptops get stolen, and files can be infected by viruses. The list goes on, but the point is that the Cloud Storage Providers may have found a silver lining in getting consumers to leap into the cloud. And the whole idea of the small NAS vendor-big Cloud Backup dynamic duo just got a big endorsement last night. Guess who has decided to dip its grubby hands into the pie?
EMC, the 800-pound gorilla of the information and storage world, through its Iomega subsidiary, wants your money! They have just married Iomega with EMC Atmos. As quoted:
“EMC subsidiary and data protection specialist Iomega announced the integration between Iomega network storage solutions and EMC Atmos, extending Atmos cloud-based data protection and sharing to Iomega’s network storage product offerings. The new integration gives small and midsize businesses (SMBs), remote offices and distributed enterprises access to any Atmos powered cloud around the world.”
Surprised? Not really, but I guess EMC needs to breathe new life into Atmos, and this marriage just extended Atmos’ life support system.
I like the way Amazon is building their Cloud Computing services. Amazon Web Services (AWS) is certainly on track to become the most powerful Cloud Computing company in the world. In fact, AWS might already be. But they are certainly not resting on their laurels, having launched 2 new services in as many weeks – Amazon DynamoDB (last week) and the Amazon Storage Gateway (this week).
I am particularly interested in the Amazon Storage Gateway, because it addresses one of the biggest fears of Cloud Computing head-on. A lot of large corporations are still adamant about keeping their data on-premises, where it is private and secure. Many remain very skeptical even though Cloud Computing is changing the IT landscape in a massive way. The barrier to entry for large corporations is not a small one, but Amazon is adapting to get more IT divisions and departments to try out Cloud Computing in a less disruptive way.
The new service is really about data storage and data backup for large corporations. This is important because large corporations have plenty of requirements for data storage and data to be backed up. And as we know, a large portion of the data stored does not need to be transactional or accessed frequently. This set of data is usually less frequently used, kept for archiving or regulatory compliance reasons, particularly in the banking and healthcare industries.
In data backup operations, the reason data is backed up is to provide a data recovery mechanism when disaster strikes. Large corporations back up tons of data every day, week or month, and this data only has value when there is a situation that requires data relevance, data immediacy or data recovery. Otherwise, it is just plenty of data taking up storage space, be it on disk or on tape.
Both data storage and data backup cost a lot of money, in both CAPEX and OPEX. In CAPEX, you are constantly pressured to buy more storage to store the ever-growing data. This leads to greater management and administration costs, both contributing heavily to OPEX. And I have not even included the OPEX costs of floor space, power and cooling, and people (training, salary, time and so on), which typically add up to 3-5x the operational costs relative to the capital investment. Such a model of IT operations related to storage cannot continue forever, and storage in the Cloud offers an alternative.
These 2 scenarios – data storage and data backup – are exactly the type of market AWS is targeting. In order to simplify and pacify large corporations, AWS introduced the Amazon Storage Gateway, which eases large corporations into taking some of their IT storage operations to the Cloud in the form of Amazon S3.
The video below shows the Amazon Storage Gateway:
The Amazon Storage Gateway is a software “appliance” that is installed on-premises in the large corporation’s data center. It integrates seamlessly into the LAN and provides an SSL (Secure Sockets Layer) connection to Amazon S3. The data being transferred to S3 is also encrypted with 256-bit AES (Advanced Encryption Standard). Both SSL and AES-256 can give customers a sense of security, and AWS claims that the implementation meets the data storage and data recovery standards used in the banking and healthcare industries.
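The same two ideas – encryption in flight over SSL/TLS and AES-256 encryption at rest – can be illustrated with plain S3 calls. This is a hedged sketch using boto3 with hypothetical bucket and key names, not the gateway’s own interface; the gateway performs the equivalent work transparently.

```python
# Illustration only: encryption in flight (HTTPS/TLS) and at rest (AES-256)
# when writing an object to S3. The Storage Gateway does the equivalent
# internally; bucket and key names here are hypothetical.
import boto3

s3 = boto3.client("s3")  # boto3 talks to S3 over HTTPS (TLS) by default

with open("fileserver-backup.img", "rb") as f:
    s3.put_object(
        Bucket="corp-backup-bucket",               # hypothetical bucket
        Key="backups/2012-02-01/fileserver.img",   # hypothetical key
        Body=f,
        ServerSideEncryption="AES256",             # AES-256 at rest
    )
```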
The data storage and backup service regularly protects the customer’s data in snapshots, giving the customer a rapid recovery platform should it experience on-premises data corruption or data disruption. At the same time, the snapshot copies in Amazon S3 can also be loaded into Amazon EBS (Elastic Block Store), and testing or development environments can be evaluated and tested with Amazon EC2 (Elastic Compute Cloud). The simplicity of sharing and combining different Amazon services will, no doubt, give customers peace of mind, easing their adoption of Cloud Computing with AWS.
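That snapshot-to-EBS-to-EC2 flow can be sketched with standard EC2 API calls. Again, this is a hedged illustration with hypothetical snapshot, instance and device identifiers, rather than anything specific to the Storage Gateway itself:

```python
# Sketch: turn a gateway snapshot stored in S3 into an EBS volume and
# attach it to an EC2 instance for test/dev. All IDs are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="ap-southeast-1")

# Create an EBS volume from the snapshot.
volume = ec2.create_volume(
    SnapshotId="snap-0123456789abcdef0",       # hypothetical snapshot ID
    AvailabilityZone="ap-southeast-1a",
)

# Wait until the volume is ready, then attach it to a test instance.
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",          # hypothetical instance ID
    Device="/dev/sdf",
)
```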
This new service starts with a 60-day free trial before moving to a USD$125.00 (about MYR$400.00) per gateway, per month subscription fee. The data storage (inclusive of the backup service) costs only 14 cents per gigabyte per month. For 1TB of data, that is approximately MYR$450 per month. Therefore, minus the initial setup costs, that comes to a total of roughly MYR$850 per month, or slightly over MYR$10,000 per year.
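For readers who want to check the arithmetic, here is the back-of-envelope calculation behind those figures. The exchange rate of roughly MYR 3.2 per USD and the 1TB = 1000GB convention are my assumptions for illustration; it also compares against the MYR$70,000-per-year figure discussed further below.

```python
# Back-of-envelope cost check for the figures quoted in the post.
# Assumptions (mine): 1 TB = 1000 GB, exchange rate ~MYR 3.2 per USD.
GATEWAY_USD_PER_MONTH = 125.00      # per gateway, per month
STORAGE_USD_PER_GB_MONTH = 0.14     # storage, inclusive of backup service
USD_TO_MYR = 3.2                    # assumed 2012-era exchange rate

data_gb = 1000                      # 1 TB of protected data

storage_usd = data_gb * STORAGE_USD_PER_GB_MONTH            # 140 USD/month
monthly_myr = (GATEWAY_USD_PER_MONTH + storage_usd) * USD_TO_MYR
annual_myr = monthly_myr * 12

print(f"Monthly: ~MYR {monthly_myr:,.0f}")                  # ~MYR 848/month
print(f"Annual:  ~MYR {annual_myr:,.0f}")                   # ~MYR 10,176/year
print(f"MYR 70,000/year is ~{70000 / annual_myr:.1f}x as much")  # ~6.9x
```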
At this point, I would like to relate an experience I had a year ago when implementing a so-called private cloud for an oil-and-gas customer in KL. They were using HP EVS (Electronic Vaulting Service) to an undisclosed HP data center hosting site in the Klang Valley. HP EVS, which was an OEM of Asigra, was not an easy solution to implement, but what was more perplexing was the fact that the customer had a poor understanding of their objectives and their 5-year plan for keeping the data protected.
When the first 3-4TB of data storage and backup were almost used up, the customer asked for a quotation for an additional 1TB of the EVS solution. The subscription for 1TB was MYR$70,000 per year. That is 7x more than the AWS cost of roughly MYR$10,000 per year! I have to salute the HP sales rep. It must have been a damn good convincing sell!
In the long run, the customer would be better off running their storage and backup on-premises with their HP EVA4400; adding an additional 1TB (and even hiring another IT administrator) would have cost a whole lot less.
Amazon Web Services has already been operating in Singapore for the past 2 years, and I am sure they are eyeing Malaysia as part of their regional market. Unless and until Malaysian companies offering Cloud Services learn to use economies of scale to capitalize on the Cloud Computing market, AWS will always be a big threat to CSP companies in Malaysia and a boon for any company seeking cloud computing services anywhere in the world.
I urge customers in Malaysia to start questioning their so-called Cloud Service Providers on whether they can do what AWS is doing. I have low confidence in what most local “cloud computing” companies can deliver right now. I hope they stop window-dressing their service offerings and start giving real cloud computing services to customers. And for customers, you must continue to research and find out which cloud services meet your business objectives. Don’t be dazzled by the fancy jargon or technical idealism thrown at you. Always, always find out more, because your business cost is at stake. Don’t be like the customer who paid MYR$70,000 for 1TB per year.
AWS is always innovating and the Amazon Storage Gateway is just another easy-to-adopt step in their quest for world domination.