Pondering Redhat’s future with IBM

I woke up yesterday morning to shocking news. IBM announced that it was buying Redhat for USD34 billion. It never crossed my mind that Redhat would sell, but I guess USD190.00 per share was too tempting. Redhat (RHT) had closed at USD116.68 the previous Friday.

Redhat is one of my favourite technology companies. I love their Linux development and progress, and I use Fedora and CentOS a lot in my hobbies. I started with Redhat back in 2000, when I became obsessed with getting my RHCE (Redhat Certified Engineer). I recall spending almost every weekend (Saturday and Sunday) back in 2002 in the office, learning Redhat and hacking scripts to get really good at it. I got certified on RHCE 4 with a score of 96%, and I was very proud of my certification.

One of my regrets was not joining Redhat in 2006. I was offered a job as an SE by Josep Garcia, the very first such position in Malaysia. Instead, I took up the Hitachi Data Systems job to helm project implementation and delivery for the Shell GUSto project. Things might have turned out differently if I had.

The IBM acquisition of Redhat left me with a poignant feeling. In many ways, Redhat has been the shining star of Linux. They are the only significant player left leading the charge of open source. They are the largest contributor to the OpenStack projects and continue to support them strongly, whilst early proponents like HPE, Cisco and Intel have reduced their support. They have, of course, been a perennial top-3 contributor to the Linux kernel since the very early days. And Redhat continues to contribute to projects such as containers and Kubernetes, and deepened that commitment with their acquisition of CoreOS a few months back.

Continue reading

Oracle Cloud Infrastructure to prove skeptics wrong

[Preamble: I have been invited by GestaltIT as a delegate to their Tech Field Day from Oct 17-19, 2018 in Silicon Valley, USA. My expenses, travel and accommodation are covered by GestaltIT, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog represents my own opinions and views]

The much-maligned Oracle Cloud is getting a fresh reboot, starting with Oracle Cloud Infrastructure (OCI), and significant enhancements and technology updates were announced at Oracle OpenWorld this week. I had the privilege of hearing about Oracle Cloud's new attack plan when they presented at Tech Field Day 17 last week.

Oracle Cloud has not had the best of days in recent months. Thomas Kurian's highly publicized resignation as President of Product Development stemmed from a disagreement with founder and CTO Larry Ellison over cloud software strategy. Then there was an ongoing lawsuit alleging that Oracle misrepresented its cloud revenue growth, which put Oracle in a bad light.

On the local front here in Malaysia, I have heard through the grapevine about the aggressive nature of Oracle personnel pushing partners and customers to adopt their cloud services, using legal scare tactics around their database licensing. A buddy of mine, who was previously the cloud business development manager at CTC Global, also shared Oracle's cloud shortcomings compared to Amazon Web Services and Microsoft Azure a year ago.

The Oracle Cloud Infrastructure team aimed to turn around those bad perceptions, starting with the delegates of Tech Field Day 17, including yours truly. Their strategy was clear: Oracle Cloud Infrastructure runs the highest-performance, most enterprise-grade Infrastructure-as-a-Service (IaaS), bar none. Unlike the IBM Cloud, which in my opinion is a wishy-washy cloud service platform, Oracle Cloud's ambition is solid.

They did a demo of the JD Edwards EnterpriseOne application, and they continued to demonstrate their prowess at delivering the highest-performance computing experience for all enterprise-grade workloads. That enterprise pedigree is clear.

Just this week, Amazon Prime Day had an outage. Amazon is in the process of weaning the Oracle database off its entire ecosystem by 2020, and this outage clearly showed that Oracle databases and enterprise applications would run best on Oracle Cloud Infrastructure.

Continue reading

The Network is Still the Computer

[Preamble: I have been invited by GestaltIT as a delegate to their Tech Field Day from Oct 17-19, 2018 in Silicon Valley, USA. My expenses, travel and accommodation are covered by GestaltIT, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog represents my own opinions and views]

Sun Microsystems coined the phrase "The Network is the Computer". It became one of the most powerful ideologies in the computing world, and over the years many technology companies have tried to emulate and practise the mantra, but fell short.

I had never heard of Drivescale. It wasn't on my radar until the legendary NFS guru, Brian Pawlowski, joined them in April this year. Beepy, as he is known, was CTO of NetApp and later of Pure Storage, and has held many technology leadership roles, including leading the development of NFSv3 and v4.

Prior to Tech Field Day 17, I was given some "homework". Stephen Foskett, Chief Cat Herder (as he is known) of Tech Field Days and Storage Field Days, highly recommended Drivescale and asked the delegates to pick up some notes on their technology. Going through a couple of the videos, Drivescale's message and philosophy resonated well with me. Perhaps it was their Sun Microsystems DNA? Many of the Drivescale team members were from Sun, and I was previously from Sun as well. I was drinking Sun's Kool-Aid by the bucketload even before I graduated in 1991, so what Drivescale preached made a lot of sense to me.

Drivescale is all about scale-out architecture at the webscale level, to address the massive scale of data processing. To understand it more deeply, we must think about "Data Locality" and "Data Mobility". I frequently use these 2 "points of discussion" in my consulting practice when architecting and designing data center infrastructure. The gist of data locality is simple: the closer the data is to the processing, the cheaper, lighter and more efficient it gets. Moving data – the data mobility part – is expensive.
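To put rough numbers on that intuition, here is a back-of-the-envelope calculation, entirely my own with assumed figures (not Drivescale's numbers), comparing shipping a dataset over the network to a remote processor versus reading it from local storage:

```python
# Back-of-the-envelope cost of data mobility vs data locality.
# All figures below are illustrative assumptions, not vendor numbers.

def transfer_seconds(data_gb: float, link_gbps: float) -> float:
    """Time to move data_gb gigabytes over a link_gbps network link."""
    data_gigabits = data_gb * 8
    return data_gigabits / link_gbps

# Moving a 10 TB dataset over a 10 Gbps link before processing it:
remote = transfer_seconds(10_000, 10)   # 8,000 s, over 2 hours
# Reading the same dataset from assumed local storage at ~3 GB/s:
local = 10_000 / 3                      # ~3,333 s, under 1 hour

print(f"network move: {remote:.0f} s, local read: {local:.0f} s")
```

Even with these generous assumptions, moving the data costs more than twice as much time as processing it in place, before counting network contention or egress charges.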

Continue reading

The Dell EMC Data Bunker

[Preamble: I have been invited by GestaltIT as a delegate to their Tech Field Day from Oct 17-19, 2018 in Silicon Valley, USA. My expenses, travel and accommodation are covered by GestaltIT, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog represents my own opinions and views]

Another new announcement graced the Tech Field Day 17 delegates this week. The Dell EMC Data Protection group announced their Cyber Recovery solution. The Cyber Recovery Vault solution and services are touted as "The Last Line of Data Protection Defense against Cyber-Attacks" for the enterprise.

Security breaches and ransomware attacks have been rampant, and they are wreaking havoc on organizations everywhere. These breaches and attacks cost businesses tens or even hundreds of millions, and are capable of bringing these businesses to their knees. One known practice is to corrupt backup metadata or catalogs, rendering operational recovery helpless, before the perpetrators attack the primary data source. And there are times when a malicious and harmful agent dwells in the organization's network or servers for long periods of time, launching attacks and infecting primary images or gold copies of corporate data at the opportune moment.

The Cyber Recovery (CR) solution from Dell EMC focuses on recovery from an isolated copy of the data. The solution isolates strategic and mission-critical secondary data and preserves the integrity and sanctity of that secondary copy. Think of the CR solution as the data bunker after doomsday has descended.

The CR solution is based on the Data Domain platforms. As the diagram below describes, data backup occurs in the corporate network to a Data Domain appliance as the backup repository. This is just the usual daily backup, and is for operational recovery.
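The two ideas that make the vault a "data bunker" are the air gap (the replication link to the vault is normally down and opens only briefly to sync) and retention locking (a vaulted copy cannot be altered or deleted until its lock expires). The toy model below illustrates only the concept; it is not Dell EMC's implementation or API, and all names in it are my own:

```python
# Toy model of an air-gapped recovery vault with retention locking.
# Concept illustration only -- not Dell EMC's implementation or API.

class RecoveryVault:
    def __init__(self, retention_seconds: float):
        self.retention_seconds = retention_seconds
        self.air_gap_open = False      # replication link is normally down
        self._copies = {}              # name -> (data, locked_until)

    def open_air_gap(self):
        self.air_gap_open = True

    def close_air_gap(self):
        self.air_gap_open = False

    def replicate(self, name: str, data: bytes, now: float):
        # Copies can only arrive while the air gap is briefly open.
        if not self.air_gap_open:
            raise PermissionError("air gap closed: vault unreachable")
        self._copies[name] = (data, now + self.retention_seconds)

    def delete(self, name: str, now: float):
        # A copy is immutable until its retention lock expires.
        _, locked_until = self._copies[name]
        if now < locked_until:
            raise PermissionError("retention lock active: copy immutable")
        del self._copies[name]

    def restore(self, name: str) -> bytes:
        return self._copies[name][0]
```

Even an attacker who fully owns the corporate network can neither reach the vault outside the sync window nor destroy a locked copy inside it, which is exactly the property the CR solution trades extra infrastructure for.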

Diagram from Storage Review. URL Link: https://www.storagereview.com/dell_emc_releases_cyber_recovery_software

Continue reading

The Commvault 5Ps of change

[Preamble: I have been invited by Commvault via GestaltIT as a delegate to their Commvault GO conference from Oct 9-11, 2018 in Nashville, TN, USA. My expenses, travel and accommodation are paid by Commvault, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog represents my own opinions and views]

I am a delegate of Commvault GO 2018, happening now in Nashville, Tennessee. I was also a delegate of Commvault GO 2017, held at National Harbor, Washington D.C. Because of scheduling last year, I only managed to stay about a day and a half before flying off to the West Coast. This year I was given the opportunity to experience the full conference at Commvault GO 2018, and I was able to savour the energy, the mindset and the culture of Commvault this time around.

Make no mistake, folks: BIG THINGS are happening at Commvault. I could feel it in their people, their partners and their customers at the GO conference. How so?

For one, Commvault is making big changes across People, Process, Pricing, Products and Perception (that's 5 Ps). Starting with Products, they have consolidated from 20+ products into 4, simplifying how the industry perceives Commvault. The diagram below shows the 4-product portfolio.

Continue reading

Let there be light with Commvault Activate

[Preamble: I have been invited by Commvault via GestaltIT as a delegate to their Commvault GO conference from Oct 9-11, 2018 in Nashville, TN, USA. My expenses, travel and accommodation are paid by Commvault, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog represents my own opinions and views]

Nobody sees well in the dark.

My interest was piqued, and I wanted to know more about Commvault Activate. The conversation started after lunch yesterday as the delegates were walking back to the Gaylord Opryland Convention Center. I was walking next to Patrick McGrath, one of Commvault's marketing folks, and we struck up a conversation in the warm breeze. Patrick started sharing a bit about Commvault Activate, what it could do, and the possibilities of many relevant business cases for the solution.

There was a déjà vu moment, bringing my thoughts back to mid-2009. I had just been invited by a friend to join him in restructuring his company, Real Data Matrix (RDM). They were a NetApp distributor, then a Platinum reseller, in the early and mid-2000s, and they had fallen on hard times. Most of their technical team had left, putting at risk their ability to retain one of the largest NetApp support contracts in Malaysia at the time.

I wanted to expand on their NetApp DNA, and I started to seek out complementary solutions to build on that DNA. Coming out of my gig at EMC, there was an interesting solution which tickled my fancy – VisualSRM. So, I went about seeking the most comprehensive SRM (storage resource management) solution for RDM, one with the widest storage platform support. I found Tek-Tools Software and moved to sign RDM up as their reseller. We got their SE/developer, Aravind Kurapati, from India to train the RDM engineers. We were ready to hit the market in late 2009/early 2010, but a few weeks later, Tek-Tools was acquired by SolarWinds.

Long story short, my mindset about SRM was "If you can't see your storage resource, you can't manage your storage". Resource visibility is so important in SRM, and the same philosophy applies to data as well. That's where Commvault Activate comes in. More than ever, Data Insights is the biggest differentiator in the data-driven transformation of any modern business today. Commvault Activate is the Data Insights solution that shines a light on all the data in every organization.

After that casual chat with Patrick, more details emerged in the early access to Commvault's embargoed announcements later that afternoon. The Commvault Activate announcement then came up in my Twitter feed.

Commvault Activate has a powerful, dynamic index engine called the Commvault 4D Index, and it is responsible for searching, discovering and learning about different types of data, data context and relationships within the organization. I picked up more information as the conference progressed, and found out that the technology behind Commvault Activate is based on the Apache Solr enterprise search and indexing platform (built on Lucene), courtesy of Lucidworks' technology. Suddenly I had a moment of recall: I had posted about the Commvault and Lucidworks partnership a few months back in my SNIA Malaysia Facebook community. The pieces connected. You can read the news of the partnership here at Forbes.
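The core data structure underneath Lucene, and therefore Solr, is the inverted index: a map from each term to the set of documents containing it, which is what makes search across millions of documents fast. A stripped-down sketch of that idea follows; it is illustrative only, and the 4D Index layers context, relationships and learning well beyond this foundation:

```python
# Minimal inverted index, the core idea behind Lucene/Solr search.
# Illustrative sketch only; names and documents here are made up.

from collections import defaultdict

class InvertedIndex:
    def __init__(self):
        self.postings = defaultdict(set)   # term -> set of doc ids

    def add(self, doc_id: str, text: str):
        # Naive tokenization; real engines also stem and normalize.
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query: str) -> set:
        # AND semantics: return documents containing every query term.
        terms = query.lower().split()
        if not terms:
            return set()
        result = set(self.postings.get(terms[0], set()))
        for term in terms[1:]:
            result &= self.postings.get(term, set())
        return result

idx = InvertedIndex()
idx.add("doc1", "customer contract renewal 2018")
idx.add("doc2", "employee contract template")
print(idx.search("contract renewal"))   # only doc1 contains both terms
```

Because each query term is a single dictionary lookup followed by set intersections, query cost depends on the number of matching documents rather than the size of the whole corpus, which is why this structure scales to enterprise-wide data discovery.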

Continue reading

Industry 4.0 secret gem with Dell

[Preamble: I have been invited by Dell Technologies as a delegate to their upcoming Dell Technologies World from Apr 30-May 2, 2018 in Las Vegas, USA. My expenses, travel and accommodation will be paid by Dell Technologies, the organizer, and I am not obligated to blog about or promote the technologies presented at this event. The content of this blog represents my own opinions and views]

This may seem a little strange. How does Industry 4.0 relate to Dell Technologies?

Recently, I was involved in an Industry 4.0 consortium called Data Industry 4.0 (di 4.0). The objective of the consortium is to combine the foundations of the 5S (seiri, seiton, seiso, seiketsu, and shitsuke), QRQC (Quick Response Quality Control) and Kaizen methodologies with the nine pillars of Industry 4.0, with a strong focus on data insight.

Industry 4.0 is the latest trend in new technologies in the manufacturing world. It is taking the manufacturing industry by storm, led by the nine pillars of Industry 4.0:

  • Horizontal and Vertical System Integration
  • Industrial Internet of Things
  • Simulation
  • Additive Manufacturing
  • Cloud Computing
  • Augmented Reality
  • Big Data and Analytics
  • Cybersecurity
  • Autonomous Robots

Continue reading

Own the Data Pipeline

[Preamble: I was a delegate of Storage Field Day 15 from Mar 7-9, 2018. My expenses, travel and accommodation were paid for by GestaltIT, the organizer, and I was not obligated to blog about or promote the technologies presented at this event. The content of this blog represents my own opinions and views]

I am a big proponent of Go-to-Market (GTM) solutions. Technology does not stand alone. It must live in an ecosystem, and in each segment of each respective industry, every ecosystem is unique. When we amalgamate data, storage infrastructure technologies and data management into the ecosystem, we reap the benefits within that ecosystem.

Data moves through the ecosystem – from system to system, north to south, east to west and vice versa – randomly, sequentially, ad hoc. Data acquires different statuses, different roles and different relevance over its lifecycle in the ecosystem. From this we derive the flow: a workflow of data that creates a data pipeline. The data pipeline concept has been around since the inception of data.

To illustrate my point, here is one I created for the upstream Oil & Gas Exploration & Production (E&P) segment some years ago.


Continue reading