The Network is Still the Computer

[Preamble: I have been invited by GestaltIT as a delegate to their Tech Field Day from Oct 17-19, 2018 in Silicon Valley, USA. My expenses, travel and accommodation are covered by GestaltIT, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog reflects my own opinions and views]

Sun Microsystems coined the phrase “The Network is the Computer“. It became one of the most powerful ideologies in the computing world. Over the years, many technology companies have tried to emulate and practise the mantra, but most fell short.

I had never heard of Drivescale. It wasn’t on my radar until the legendary NFS guru, Brian Pawlowski, joined them in April this year. Beepy, as he is known, was CTO of NetApp and later of Pure Storage, and has held many technology leadership roles, including leading the development of NFSv3 and v4.

Prior to Tech Field Day 17, I was given some “homework”. Stephen Foskett, Chief Cat Herder (as he is known) of Tech Field Days and Storage Field Days, highly recommended Drivescale and asked the delegates to pick up some notes on their technology. Going through a couple of the videos, Drivescale’s message and philosophy resonated well with me. Perhaps it was their Sun Microsystems DNA? Many of the Drivescale team members came from Sun, and I was previously from Sun as well. I was drinking Sun’s Kool-Aid by the bucket load even before I graduated in 1991, so what Drivescale preached made a lot of sense to me.

Drivescale is all about Scale-Out Architecture at the webscale level, to address the massive scale of data processing. To understand it more deeply, we must think about “Data Locality” and “Data Mobility“. I frequently use these 2 points of discussion in my consulting practice when architecting and designing data center infrastructure. The gist of data locality is simple – the closer the data is to the processing, the cheaper, lighter-weight and more efficient it gets. Moving data – the data mobility part – is expensive.

Continue reading

The Dell EMC Data Bunker

[Preamble: I have been invited by GestaltIT as a delegate to their Tech Field Day from Oct 17-19, 2018 in Silicon Valley, USA. My expenses, travel and accommodation are covered by GestaltIT, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog reflects my own opinions and views]

Another new announcement graced the Tech Field Day 17 delegates this week. The Dell EMC Data Protection group announced their Cyber Recovery solution. The Cyber Recovery Vault solution and services are touted as “The Last Line of Data Protection Defense against Cyber-Attacks” for the enterprise.

Security breaches and ransomware attacks have been rampant, wreaking havoc on organizations everywhere. These breaches and attacks cost businesses tens, or even hundreds, of millions, and are capable of bringing these businesses to their knees. One known practice is to corrupt backup metadata or catalogs, rendering operational recovery helpless, before the perpetrators attack the primary data source. And there are times when the malicious and harmful agent dwells in the organization’s network or servers for long periods of time, launching attacks and infecting primary images or gold copies of corporate data at the opportune moment.

The Cyber Recovery (CR) solution from Dell EMC focuses on Recovery of an Isolated Copy of the Data. The solution isolates strategic and mission-critical secondary data and preserves the integrity and sanctity of that secondary data copy. Think of the CR solution as the data bunker, after doomsday has descended.

The CR solution is based on the Data Domain platforms. As shown in the diagram below, data backup occurs in the corporate network to a Data Domain appliance acting as the backup repository. This is just the usual daily backup, and is meant for operational recovery.

Diagram from Storage Review. URL Link: https://www.storagereview.com/dell_emc_releases_cyber_recovery_software

Continue reading

The Commvault 5Ps of change

[Preamble: I have been invited by Commvault via GestaltIT as a delegate to their Commvault GO conference from Oct 9-11, 2018 in Nashville, TN, USA. My expenses, travel and accommodation are paid by Commvault, the organizer, and I was not obligated to blog about or promote their technologies presented at this event. The content of this blog reflects my own opinions and views]

I am a delegate of Commvault GO 2018 happening now in Nashville, Tennessee. I was also a delegate of Commvault GO 2017 held at National Harbor, Washington D.C. Because of scheduling last year, I only managed to stay about a day and a half before flying off to the West Coast. This year I was given the opportunity to experience the full conference at Commvault GO 2018. And I was able to savour the energy, the mindset and the culture of Commvault this time around.

Make no mistake, folks, BIG THINGS are happening at Commvault. I can feel it from their people, their partners and their customers at the GO conference. How so?

For one, Commvault is making big changes across People, Process, Pricing, Products and Perception (that’s 5 Ps). Starting with Products, they have consolidated more than 20 products into 4, simplifying how the industry perceives Commvault. The diagram below shows the 4-product portfolio.

Continue reading

Let there be light with Commvault Activate

[Preamble: I have been invited by Commvault via GestaltIT as a delegate to their Commvault GO conference from Oct 9-11, 2018 in Nashville, TN, USA. My expenses, travel and accommodation are paid by Commvault, the organizer, and I was not obligated to blog about or promote their technologies presented at this event. The content of this blog reflects my own opinions and views]

Nobody sees well in the dark.

My curiosity is piqued and I want to know more about Commvault Activate. The conversation started after lunch yesterday as the delegates were walking back to the Gaylord Opryland Convention Center. I was walking next to Patrick McGrath, one of Commvault’s marketing folks, and we struck up a conversation in the warm breeze. Patrick started sharing a bit about Commvault Activate, what it could do, and the many relevant business cases possible for the solution.

There was a déjà vu moment, bringing my thoughts back to mid-2009. I had just been invited by a friend to join him in restructuring his company, Real Data Matrix (RDM). They were a NetApp distributor, then a Platinum reseller, in the early and mid-2000s, but they had fallen on hard times. Most of their technical team had left, putting them in a tough spot to retain one of the largest NetApp support contracts in Malaysia at the time.

I wanted to expand on their NetApp DNA and started to seek out complementary solutions to build on it. Coming out of my gig at EMC, there was an interesting solution which tickled my fancy – VisualSRM. So I went about seeking the most comprehensive SRM (storage resource management) solution for RDM, one with the widest storage platform support. I found Tek-Tools Software and moved to sign RDM up as their reseller. We brought in their SE/developer, Aravind Kurapati, from India to train the RDM engineers. We were ready to hit the market in late-2009/early-2010, but a few weeks later, Tek-Tools was acquired by SolarWinds.

Long story short, my mindset about SRM was “If you can’t see your storage resources, you can’t manage your storage“. Resource visibility is critical in SRM, and the same philosophy applies to data as well. That’s where Commvault Activate comes in. More than ever, data insight is the biggest differentiator in the data-driven transformation of any modern business today. Commvault Activate is the data insights solution that shines a light on all the data in every organization.

After that casual chat with Patrick, more details came up in the early access to Commvault’s embargoed announcements later that afternoon. The Commvault Activate announcement then came up in my Twitter feed.

Commvault Activate has a powerful, dynamic index engine called the Commvault 4D Index, which is responsible for searching, discovering and learning about the different types of data, data context and relationships within the organization. I picked up more information as the conference progressed and found out that the technology behind Commvault Activate is based on the Apache Solr enterprise search and indexing platform (built on Lucene), courtesy of Lucidworks‘ technology. Suddenly I had a moment of recall: I had posted about the Commvault and Lucidworks partnership a few months back in my SNIA Malaysia Facebook community. The pieces connected. You can read the news of the partnership here at Forbes.
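To give a feel for what a Lucene/Solr-style engine does at its core, here is a toy inverted index in Python. This is a vastly simplified illustrative sketch of the general indexing technique, not the actual Solr or Commvault 4D Index implementation; the document names and contents are made up.

```python
from collections import defaultdict

# Hypothetical corpus -- stand-ins for the corporate data an indexer would crawl.
documents = {
    "doc1": "customer contract renewal terms",
    "doc2": "employee contract and payroll data",
    "doc3": "backup schedule for payroll server",
}

# Build the inverted index: each term maps to the set of documents containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(term):
    """Return the documents containing the term, in sorted order."""
    return sorted(index.get(term.lower(), set()))

print(search("contract"))  # ['doc1', 'doc2']
print(search("payroll"))   # ['doc2', 'doc3']
```

Real engines add tokenization, stemming, relevance ranking and distributed sharding on top, but the term-to-document mapping above is the heart of how such an engine finds data context quickly.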

Continue reading

The Malaysian Openstack storage conundrum

The Openstack blips on my radar have ratcheted up this year. I have been asked to put together IaaS designs several times, with either the RedHat or Ubuntu flavour, and it is good to see the Openstack interest level going up in the Malaysian IT scene. Coming into its 8th year, Openstack has become a mature platform, but my observations tell me that the storage-related projects of Openstack are not as well known as they should be.

I was one of the speakers at the Openstack Malaysia 8th Summit over a month ago. I started my talk with a question – “Can anyone name the 4 Openstack storage projects?“. The response from the floor was “Swift, Cinder, Ceph and … (nobody knew the 4th one)”. It took me by surprise that the floor almost unanimously agreed that Ceph is one of the Openstack projects, but we know that Ceph isn’t one. Ceph? An Openstack storage project?

Besides Swift and Cinder, there is Glance (depending on how you look at it) and the least known of all … Manila.

I have also been following many Openstack Malaysia discussions and discussion groups for a while. That Ceph response showed the lack of awareness and knowledge of the Openstack storage projects among the Malaysian IT crowd, and it is a difficult issue to tackle. The storage conundrum continues to perplex me, because many whom I have spoken to seem to avoid talking about storage, viewing it as a dark art or some voodoo thingy.

I view storage as the cornerstone of the 3 infrastructure pillars – compute, network and storage – of Openstack, or of any software-defined infrastructure stack for that matter. So it is important to gain an understanding of the Openstack storage projects, especially Cinder.

Cinder is the abstraction layer that provides management and control of the block storage beneath it. In a nutshell, it allows Openstack VMs and applications to consume block storage in a consistent and secure way, regardless of the storage infrastructure or technology underneath. This is achieved through the cinder-volume service, which most storage vendors integrate with through their own drivers (as shown in the diagram below).

Diagram in slides is from Mirantis found at https://www.slideshare.net/mirantis/openstack-architecture-43160012

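The vendor-driver pattern behind cinder-volume can be sketched in a few lines of Python. This is an illustrative toy only, not actual Cinder code: real vendor drivers subclass `cinder.volume.driver.VolumeDriver` and implement many more hooks, while the class and method names below are simplified assumptions for the sketch.

```python
from abc import ABC, abstractmethod

class VolumeDriver(ABC):
    """Minimal stand-in for the interface cinder-volume expects of a backend."""

    @abstractmethod
    def create_volume(self, name: str, size_gb: int) -> dict: ...

    @abstractmethod
    def delete_volume(self, name: str) -> None: ...

class FakeArrayDriver(VolumeDriver):
    """Hypothetical vendor backend; a real driver would call the array's API."""

    def __init__(self):
        self.volumes = {}

    def create_volume(self, name, size_gb):
        self.volumes[name] = {"name": name, "size_gb": size_gb, "status": "available"}
        return self.volumes[name]

    def delete_volume(self, name):
        self.volumes.pop(name, None)

# cinder-volume loads the configured driver and dispatches requests to it,
# so the consumer never sees which storage technology sits underneath.
driver: VolumeDriver = FakeArrayDriver()
vol = driver.create_volume("vm01-data", 100)
print(vol["status"])  # available
```

Swapping `FakeArrayDriver` for another implementation changes the storage technology without touching the consumer, which is precisely the consistency Cinder gives Openstack VMs and applications.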

Cinder-volume, together with cinder-api and cinder-scheduler, forms the Block Storage Services for Openstack. There is another service, cinder-backup, which integrates with Openstack Swift, but at my last check, this service is not as popular as cinder-volume, which is widely supported by many storage vendors with both Fibre Channel and iSCSI implementations, and by a few vendors with NFS and SMB as well.

Continue reading
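As a concrete illustration of how these services are wired together, a cinder.conf fragment along the lines of the Openstack install guides might look like the following. Option names vary between Openstack releases, so treat this as a hedged sketch rather than a definitive configuration; the LVM backend shown here is just the reference driver, where a vendor array driver would normally go.

```ini
[DEFAULT]
# cinder-api, cinder-scheduler and cinder-volume all read this file.
enabled_backends = lvm
# cinder-backup writing its copies into Swift (option name varies by release).
backup_driver = cinder.backup.drivers.swift

[lvm]
# Reference LVM backend exporting volumes over iSCSI.
volume_driver = cinder.volume.drivers.lvm.LVMVolumeDriver
volume_group = cinder-volumes
iscsi_protocol = iscsi
iscsi_helper = tgtadm
volume_backend_name = LVM
```

Adding a second stanza for a vendor backend and listing it in `enabled_backends` is all it takes for cinder-scheduler to start placing volumes across multiple storage platforms.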

Industry 4.0 secret gem with Dell

[Preamble: I have been invited by Dell Technologies as a delegate to their upcoming Dell Technologies World from Apr 30-May 2, 2018 in Las Vegas, USA. My expenses, travel and accommodation will be paid by Dell Technologies, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog reflects my own opinions and views]

This may seem a little strange. How does Industry 4.0 relate to Dell Technologies?

Recently, I was involved in an Industry 4.0 consortium called Data Industry 4.0 (di 4.0). The objective of the consortium is to combine the foundations of the 5S (seiri, seiton, seiso, seiketsu and shitsuke), QRQC (Quick Response Quality Control) and Kaizen methodologies with the nine pillars of Industry 4.0, with a strong focus on data insight.

Industry 4.0 has been the latest trend in new technologies in the manufacturing world. It is taking the manufacturing industry segment by storm, led by the nine pillars of Industry 4.0:

  • Horizontal and Vertical System Integration
  • Industrial Internet of Things
  • Simulation
  • Additive Manufacturing
  • Cloud Computing
  • Augmented Reality
  • Big Data and Analytics
  • Cybersecurity
  • Autonomous Robots

Continue reading

Own the Data Pipeline

[Preamble: I was a delegate of Storage Field Day 15 from Mar 7-9, 2018. My expenses, travel and accommodation were paid for by GestaltIT, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog reflects my own opinions and views]

I am a big proponent of Go-to-Market (GTM) solutions. Technology does not stand alone. It must be in an ecosystem, and in each industry, in each segment of each respective industry, every ecosystem is unique. And when we amalgamate data, the storage infrastructure technologies and the data management into the ecosystem, we reap the benefits in that ecosystem.

Data moves in the ecosystem, from system to system, north to south, east to west and vice versa, randomly, sequentially, ad hoc. Data acquires different statuses, different roles and different degrees of relevance in its lifecycle through the ecosystem. From this we derive the flow – a workflow of data that creates a data pipeline. The data pipeline concept has been around since the inception of data.

To illustrate my point, I created one for the upstream Oil & Gas – Exploration & Production (E&P) segment some years ago.


Continue reading

The leapfrog game in Asia with HPC

Brunei, a country rich in oil and gas, is facing a crisis. Their oil and gas reserves are rapidly running dry and are expected to be depleted within 2 decades. Their deep dependency on oil and gas, once the boon of their economy, is now the bane of their future.

Since 2000, I have been in and out of Brunei and got involved in several engagements there. It is a wonderful and peaceful country with friendly people, always welcoming visitors with open hearts. The country prospered for decades on its vast oil riches, but in the past few years, oil prices have slumped. The profits of oil and gas no longer justify the costs of exploration and production.

2 years ago, I started pitching a new economy generator to the IT partners in Brunei, one that I believe will give a country like Brunei the ability to leapfrog its neighbours in South East Asia: to start building a High Performance Computing as a Service (HPC-as-a-Service) type of business.

Why HPC? Why do I think HPC will give a developing country like Brunei super powers in the digital economy?

Continue reading