Building Trust in the Storage Brand

Trust is everything. When done right, the brand is trust.

A Wikibon article last month, “Does Hardware (still) Matter?”, echoed my sentiments and hit close to home. As the world becomes increasingly data-driven and cloud-centric, IT infrastructure has faded from the purview of the boardroom. Its importance cannot be discounted, but in this new age, storage infrastructure has become invisible.

In the seas of both on-premises and hybrid storage technology solutions, everyone is trying to stand out, trying to eke out the minutest ounce of differentiation and advantage to gain the customer’s micro-attention. With all the drum beating, the loyalty of the customer can switch in an instant unless we build trust.

I ponder a few storage industry variables that help build trust.

Open source Communities and tribes

During the heydays of proprietary software and OSes, protectionism was key to guarding differentiation and advantage. Licenses were common, and some were paired with the hardware hostid to create that “power combination”. And who can forget those serial dongle license keys? Urgh!!

Since the open source movement began (read The Cathedral and the Bazaar), the IT world has come to trust software and OSes more and more. Open source communities grew and technology tribes formed in all types of niches, including storage software. Trust grew because the communities kept the vendors honest. Gone are the days of the Evil Empire. Even Microsoft® became a ‘cool kid’.

TRUST

One open source storage filesystem I worked extensively on is OpenZFS. From its beginnings in OpenSolaris® (remember build 134?), it became part of the illumos project and later went upstream into FreeBSD® and Linux. Trust in OpenZFS developed over time because of the open source model. It has spawned many storage projects, including FreeNAS™, which later became TrueNAS®.

Continue reading

The young report card on Decentralized Storage

I kept this blog in my queue for over 4 months. I was reluctant to publish it because I thought the outrageous frenzies of NFTs (non-fungible tokens), metaverses and web3 were convoluting the discussions on decentralized storage. 3 weeks back, a 90-day Google Trends search on these 3 opaque terms showed that the worldwide fads were waning. Here was the Google Trends output on April 2, 2022:

Google Trends on NFT, metaverse and web3

Decentralized storage intrigues me. I like to believe in its potential, and I often talk to people to strengthen the narratives and support its adoption where it fits. But too often, the real objectives of decentralized storage are obfuscated by the polarized conversations about the cryptocurrencies pegged to their offerings, NFTs (non-fungible tokens), DAOs (decentralized autonomous organizations), and plenty of hyperbole and bewildering claims.

But I continue to seek sustainable conversations about decentralized storage without the sway of the NFTs or the cryptos. After dipping my toes in, engaging with HODLers, and watching the return to sanity, I believe we can discuss decentralized storage with better clarity now. The context is to position decentralized storage for the mainstream, specifically for business organizations already immersed in centralized storage. Here is my fledgling report card on decentralized storage.

Continue reading

Backup – Lest we forget

World Backup Day – March 31st

Last week was World Backup Day. It falls on March 31st every year so that you don’t lose your data and become an April Fool the next day.

Amidst the growing awareness of the importance of backup, no thanks to the ever-growing destructive nature of ransomware, it is important to look into other aspects of data protection, from both a data backup/recovery and a data security point of view.

3-2-1 Rule, A-B-C and Air Gaps

I have highlighted the basic 3-2-1 rule before: keep at least 3 copies of the data, on 2 different types of media, with 1 copy offsite. It must always be paired with a set of practised processes and policies that cultivate all stakeholders (aka the people) in the organization to understand the importance of protecting the data and ensuring its recoverability.
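To make the rule concrete, here is a minimal sketch of a 3-2-1 check, assuming a simple hand-maintained inventory of copies; the class and field names are illustrative, not taken from any backup product.

```python
# A minimal 3-2-1 compliance check over an illustrative inventory.
from dataclasses import dataclass

@dataclass
class BackupCopy:
    name: str
    media: str      # e.g. "disk", "tape", "object-store"
    offsite: bool

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    # 3 copies of the data (production counts as one), on at least
    # 2 different media types, with at least 1 copy offsite.
    enough_copies = len(copies) >= 3
    enough_media = len({c.media for c in copies}) >= 2
    one_offsite = any(c.offsite for c in copies)
    return enough_copies and enough_media and one_offsite

copies = [
    BackupCopy("production NAS", "disk", offsite=False),
    BackupCopy("local backup", "disk", offsite=False),
    BackupCopy("cloud archive", "object-store", offsite=True),
]
print(satisfies_3_2_1(copies))  # True
```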

The A-B-C is to look at the production dataset and decide if the data should remain on Tier 1 storage. In most cases, data becomes less active over time, and these datasets are good candidates to be archived. Once archived, the production dataset is smaller, and data backup operations become lighter and faster, with positive knock-on effects as well.
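As a rough illustration of finding archive candidates, here is a sketch that flags files untouched beyond an idle threshold; the path and the 180-day cutoff are assumptions for the example.

```python
# Flag files untouched for a set period as candidates for archiving,
# shrinking the active dataset the backup job has to crawl.
import os
import time

def archive_candidates(root: str, idle_days: int = 180):
    cutoff = time.time() - idle_days * 86400
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    yield path
            except OSError:
                continue  # file vanished or unreadable; skip it

for path in archive_candidates("/mnt/projects"):  # illustrative path
    print(path)
```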

Air gaps have returned to prominence with the heightened threats on data in recent years. The threats have pushed organizations to consider keeping data copies offsite and offline behind air gaps. Cost and speed of recovery can be concerns, and logical air gaps are also gaining acceptance as an extra layer of data protection.

Backup is not total Data Protection cyberdefence

If we view data protection more holistically and comprehensively, backup (and recovery) is not the total data protection solution. We must ignore the fancy rhetoric of technology marketers claiming that backup alone ensures data protection, because there is much more to it than that.

The well-respected NIST (National Institute of Standards and Technology) Cybersecurity Framework places Recover (along with backup) as the last of its five pillars: Identify, Protect, Detect, Respond and Recover.

NIST Cybersecurity Framework

Continue reading

Nakivo Backup Replication architecture and installation on TrueNAS – Part 1

Backup and replication software has a strong mandate in organizations with enterprise mindsets and vision. But lower down the rungs, small and medium organizations are less invested in backup and replication software. These organizations know full well that they must back up, replicate and protect their servers, physical and virtual, as well as new workloads in the clouds, given that the threat of security breaches and ransomware looms larger all the time. But many are put off by the cost of implementing and deploying backup and replication software.

So I explored one of the lesser-known backup and recovery products, Nakivo® Backup and Replication (NBR), and took the opportunity to build a backup and replication appliance in my homelab with TrueNAS®. My objective was to create a cost-effective option for small and medium organizations to enjoy enterprise-grade protection and recovery without the hefty price tag.

This blog, Part 1, covers the architecture of Nakivo® and the installation of the NBR software on TrueNAS® to bake in the concept of a backup and replication appliance. Part 2, in a future blog post, will cover the administration and day-to-day operations of NBR.

Continue reading

IT Data practices and policies still wanting

There is an apt and honest editorial cartoon about Change.

From https://commons.wikimedia.org/wiki/File:Who-Wants-Change-Crowd-Change-Management-Yellow.png

I was a guest of the Channel Asia Executive Roundtable last week. I joined several luminaries in South East Asia to discuss the topic of “How Partners can bring value to the businesses to manage their remote workforce“.

Covid-19 decimated what we knew as work. The world had to pivot and now, 2+ years later, a hybrid workforce has emerged. The mixture of remote work, work-from-home (WFH), the physical office and everywhere else has brought a new mindset and new attitudes to employers and staff alike. Without a doubt, the remote way of working is here to stay.

People won but did the process lose?

The knee-jerk reaction when the Covid lockdowns hit was to switch work to remote access to applications, on premises or in the clouds. Many companies had already moved to the software-as-a-service (SaaS) way of working, but not all had made the jump, just as not all of a company’s applications were SaaS-based. Of course, the first thing these stranded companies did was look for technologies to solve this unforeseen disorder.

People Process Technology.
Picture from https://iconstruct.com/blog/people-process-technology/

Continue reading

Please cultivate 3-2-1 and A-B-C of Data Management

My Sunday morning was muddled 2 weeks ago. There was a frenetic call from someone I knew a while back, and he needed some advice. It turned out that his company’s files had been encrypted and the “backups” (more on this later) were gone. With some detective work, I found that their files were stored on a Synology® NAS, often accessed remotely via QuickConnect, and “backed up” to Microsoft® Azure. I put “backup” in inverted commas because their definition of backup was using Synology®’s Cloud Sync to Azure. That is not a true backup but a file synchronization service that is often mislabeled as a data protection backup service.
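The difference is easy to show in a toy sketch: sync mirrors only the latest state of a file, so a ransomware-encrypted version replaces the good copy at the far end, while a versioned backup keeps point-in-time copies that survive the attack. The function names and paths below are illustrative only, not from Synology® or Azure.

```python
# A toy sketch of why file synchronization is not backup.
import os
import shutil
import time

def sync(src: str, dst_dir: str) -> None:
    # Mirror the current state: whatever is at src, including a
    # ransomware-encrypted file, silently replaces the previous copy.
    shutil.copy2(src, os.path.join(dst_dir, os.path.basename(src)))

def backup(src: str, dst_dir: str) -> None:
    # Keep every point-in-time copy under a timestamped name, so a
    # pre-attack version remains recoverable.
    stamp = time.strftime("%Y%m%d-%H%M%S")
    shutil.copy2(src, os.path.join(dst_dir, f"{os.path.basename(src)}.{stamp}"))
```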

All of his company’s project files were encrypted and there were no backups to recover from. It was a typical ransomware cluster-F crime scene.

I would have gloated, because many small and medium businesses like his take a very poor and lackadaisical attitude towards good data management practices. There is no use crying over spilled milk; prevention is better than cure, and without investing early in prevention, the cure will likely cost 3x more. In this case, he wanted to use Deloitte®’s recovery services, which I did not know existed. “Good luck with the recovery” was all I said to him, after my Sunday morning had been turned topsy-turvy.

NAS is the ransomware goldmine

I have said it before and I am saying it again. NAS devices, especially the consumer and prosumer brands, are easy pickings because little attention is paid to implementing good data management practices, either by the respective vendors or by the end users themselves. 2 years ago I was already seeing a consistent pattern of heightened ransomware attacks on NAS devices, especially those that proliferated in the small and medium business market segment.

The WFH (work from home) practice triggered by the Covid-19 pandemic has made NAS devices essential for businesses. NAS devices are the workhorses of many businesses, after all. The ease of connecting from anywhere, with features similar to the Synology® QuickConnect I mentioned earlier, or through VPNs (virtual private networks), or self-created port forwarding (for those who want to save a quick buck [ sarcasm ]), opened the doors to bad actors and easy ransomware incursions. Good data management practices are often sidestepped or ignored in exchange for simplicity, convenience, and saving a few foolish dollars. Until ….

Continue reading

Time to Conflate Storage with Data Services

Around the year 2016, I started to put together a better structure to explain storage infrastructure. I began using the term Data Services Platform before it became what it is today, and I formed a pictorial scaffold to depict what I wanted to share. This is what I made at the time.

Data Services Platform (circa 2016) - Copyright Heoh Chin Fah

One of the reasons I am bringing this up again is that many end users and resellers still look at storage purely from the perspective of capacity, performance and price. And, as if two plus two equals five, many storage pre-sales engineers and architects reciprocate with the same type of responses, which has deteriorated the view of the storage technology infrastructure industry as a whole. This situation irks me. A lot.

Continue reading

Rethinking data processing frameworks in real time

“Row, row, row your boat, gently down the stream…”

Except the stream isn’t gentle at all in data processing’s new context.

For many of us in the storage infrastructure and data management world, the well-known framework is storing data on, and retrieving it from, a storage medium. That medium could be a disk-based storage array, a tape, or some cloud storage where the medium is abstracted from the users and the applications. The model of post-processing the data after it has been safely and persistently stored on that medium is well understood and mature. Users, applications and workloads (A&Ws) process this data in its resting phase: retrieve it, work on it, and write it back to rest again.

There is another model of data processing that has been bubbling away for years and is now reaching a boiling point, though it has not reached its apex yet. This is processing the data in flight, while it is still flowing, as it passes through a processing engine. The nature of this kind of data is described in a 2018 conference presentation I chanced upon a year ago.

letgo marketplace processing numbers in 2018

* NRT = near real time

From a storage technology infrastructure perspective, this kind of data processing piqued my curiosity immensely. I have been studying this burgeoning new data processing model in my spare time, figuring out where it fits and bringing that understanding back to the storage infrastructure and data management side.
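For illustration, here is a minimal sketch contrasting the two models, assuming a simple line-oriented record stream; it is a conceptual aid only, not a real processing engine, which would also handle windowing, ordering and fault tolerance.

```python
# Contrasting at-rest (batch) and in-flight (stream) processing.
from typing import Iterable, Iterator

def batch_process(path: str) -> None:
    # At-rest model: data is already persisted, then post-processed.
    with open(path) as f:
        records = f.readlines()              # retrieve from rest
    results = [r.upper() for r in records]   # work on it
    with open(path + ".out", "w") as f:
        f.writelines(results)                # write back to rest

def stream_process(stream: Iterable[str]) -> Iterator[str]:
    # In-flight model: act on each record as it flows past, with no
    # persistent hop in between.
    for record in stream:
        yield record.upper()
```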

Continue reading

How well do you know your data and the storage platform that processes the data

Last week was consumed by many conversations on this topic, and I was quite jaded, really. Unfortunately, many still take a very simplistic view of storage technology, or should I say, of the over-marketing of storage technology. So much so that end users make incredible assumptions about the benefits of a storage array, a software-defined storage platform, or even cloud storage. And too often, the caveats of turning on a feature or tuning a configuration to the max are discarded or neglected. Regard for good storage and data management best practices? What’s that?

I share some of my thoughts on handling conversations like these, and on trying to set the right expectations rather than overhyping a feature or a function in the data storage services.

Complex data networks and the storage services that serve it

I/O Characteristics

Applications and workloads (A&Ws) read from and write to the data storage services platforms. These could be local DAS (direct-attached storage), network storage arrays in SAN and NAS, and now object storage, or cloud storage services. Regardless of structured or unstructured data, different A&Ws have different behavioural I/O patterns when accessing data from storage. Therefore storage has to be configured as closely as possible to match these patterns, so that it performs optimally for these A&Ws. Without going into deep detail, here are a few to think about, with a small sketch after the list:

  • Random and Sequential patterns
  • Block sizes of these A&Ws, typically ranging from 4K to 1024K
  • Causal effects of synchronous and asynchronous I/Os to and from the storage
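To make the first two characteristics concrete, here is a small sketch of the same file read sequentially versus at random offsets, with a configurable block size. In practice a benchmarking tool like fio measures this properly; the block size and I/O count below are assumptions for the example.

```python
# Sequential versus random read patterns over the same file.
import os
import random

BLOCK = 4 * 1024                 # 4K block size; A&Ws range up to 1024K

def sequential_read(path: str) -> None:
    with open(path, "rb") as f:
        while f.read(BLOCK):     # each block follows the previous one
            pass

def random_read(path: str, n_ios: int = 1000) -> None:
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        for _ in range(n_ios):
            f.seek(random.randrange(0, max(size - BLOCK, 1)))
            f.read(BLOCK)        # each block lands at an arbitrary offset
```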

Continue reading

At the mercy of the cloud deity

Amazon Web Services (AWS) went down in the middle of last week. News of the outage was everywhere:

AWS Management Console unavailable error

Piling the misery

The AWS outage headlines attract the naysayers, the fickle armchair pundits, and the opportunists. Here are a few news articles that brought these folks out to chastise the cloud giant.

Of course, I am one of these critics; I don’t deny it. But I read this situation through the lens of the multicloud hyperbole, of which I am not a fan. There is too much multicloud whitewashing by vendors pitching multicloud as a disaster recovery solution, without understanding that this is easier said than done.

Continue reading