Storage IO straight to GPU

The parallel processing power of the GPU (Graphics Processing Unit) cannot be denied. One year ago, nVidia® overtook Intel® in market capitalization. And today, nVidia®'s market capitalization is more than double Intel®'s: USD$510.53 billion versus USD$229.19 billion [as of July 2, 2021].

Thus it is not surprising that storage architectures are changing from the CPU-centric paradigm to take advantage of the burgeoning prowess of the GPU. Two announcements in the storage news in recent weeks have caught my attention – the Windows 11 DirectStorage API and nVidia® Magnum IO GPUDirect® Storage.

nVidia GPU

Exciting the gamers

The Windows DirectStorage API feature is only available in Windows 11. It was announced as part of the Xbox® Velocity Architecture last year to take advantage of the high I/O capability of modern-day NVMe SSDs. DirectStorage-enabled applications and games leverage several technologies, such as Direct3D (D3D) compression/decompression algorithms designed to run on the GPU, and Sampler Feedback Streaming (SFS), which uses the results of the previously rendered frame to decide which higher-resolution textures to load into GPU memory and render for a real-time gaming experience.
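Both announcements share the same underlying idea: move data from NVMe media into GPU memory without a detour through a CPU-side bounce buffer. As a rough illustration of what that looks like on the nVidia® side, here is a minimal sketch assuming the kvikio Python bindings for cuFile (part of Magnum IO GPUDirect® Storage) and CuPy on a GDS-enabled system; the file path and buffer size are hypothetical placeholders.

```python
# A minimal sketch, assuming the kvikio Python bindings for cuFile
# (nVidia Magnum IO GPUDirect Storage) and CuPy on a GDS-enabled system.
# The file path and buffer size below are hypothetical placeholders.
import cupy
import kvikio

# Allocate a 256 MiB destination buffer directly in GPU memory.
gpu_buffer = cupy.empty(256 * 1024 * 1024 // 4, dtype=cupy.float32)

# Open the file through cuFile; the read lands in GPU memory,
# bypassing the CPU-side bounce buffer when GDS is active.
with kvikio.CuFile("/data/textures/chunk_000.bin", "r") as f:
    nbytes = f.read(gpu_buffer)

print(f"Read {nbytes} bytes straight into GPU memory")
```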

Continue reading

Is General Purpose Object Storage disenfranchised?

[Disclosure: I am invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in Silicon Valley, USA. My expenses, travel, accommodation and conference fees will be covered by GestaltIT, the organizer, and I am not obligated to blog or promote the vendors’ technologies presented at this event. The content of this blog is of my own opinions and views]

This is NOT an advertisement for coloured balls.

This is the license to brag for the vendors in the next 2 weeks or so, as we approach the 2020 new year. This, of course, is the latest 2019 IDC MarketScape for Object-Based Storage, released last week.

My object storage mentions

I have written extensively about Object Storage since 2011, from different angles and perspectives. Here are some of those posts:

Continue reading

Thinking small to solve Big

[This article was posted in my LinkedIn at https://www.linkedin.com/pulse/thinking-small-solve-big-chin-fah-heoh/ on Sep 9th 2019]

The world’s economy has certainly turned. And organizations, especially the SMEs, are demanding more. There was a time when many technology vendors and their tier 1 systems integrators could get away with plenty of high-level hobnobbing and showering the prospect with their marketing wow-factor. But those fancy-schmancy days are drying up, and SMEs now do a lot of research and demand a more elaborate and more comprehensive technology solution to their requirements.

The SMEs have the same problems faced by the larger organizations. They want more data stored, protected and recoverable, and they want to maximize the value of that data. However, their risk factors are much higher than the larger enterprises’, because a disruption or a simple breakdown could affect their business and operations far more than it would a larger organization’s. In most situations, they have no safety net.

So, over the past 3-odd years, I have learned that as a technology solution provider, as a systems integrator to SMEs, I have to be on the ball with their pains all the time. And I have to always remember that they do not have deep pockets, especially when the economy in Malaysia has been soft for years.

That is why I have gravitated to technology solutions that matter to the SMEs and are gentle on their pockets as well. Take, for instance, a small company called Itxotic that I discovered earlier this year. Itxotic is a 100% Malaysian home-grown technology startup focusing on customized industry intelligence, notably computer vision AI. Their most prominent technology includes defect detection on a manufacturing production line.

 

At the enterprise level, it is easy for large technology providers like Hitachi or GE or Siemens to peddle similar high-tech solutions for SME requirements. But these would come with a price tag of hundreds of thousands of ringgit. SMEs will balk at such a large investment because a price tag like that is simply out of reach for SME factories. That is why I gravitated to the small thinking of Itxotic, whose small yet powerful technology solves big problems for the SMEs.

And this came about when more Industry 4.0 opportunities started to come onto my radar. Similarly, I was also approached to look into an edge-network data analytics technology to be integrated into PLCs (programmable logic controllers). At present, the industry consultants who approached me are peddling a foreign technology solution, and that technology costs RM13,000 per CPU core. In a typical 4-core processor IPC (industrial PC), that is a whopping RM52,000, before hardware and integration services. This can easily drive the selling price to over RM100K, again a price tag that will trigger a mini heart attack among the SMEs.

The industry consultants have tasked me with designing a more cost-friendly, aka cheaper, solution, and today we are already building an alternative with Apache Kafka, its connectors, and Grafana for visual reporting. I think the cost to build this alternative will probably be 70-80% lower than the technology they are reselling now. The “think small, solve Big” mantra is beginning to take hold, and I am excited about it.
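To make the Kafka alternative concrete, here is a minimal sketch of the ingestion leg, assuming the kafka-python client; the broker address, topic and PLC tag names are hypothetical placeholders, and a real deployment would pull readings off the PLC through a Modbus or OPC-UA connector rather than generate them.

```python
# A minimal sketch of pushing PLC readings into Apache Kafka,
# assuming the kafka-python client. Broker address, topic and tag
# names are hypothetical; a real setup would read from the PLC via
# a Modbus/OPC-UA connector instead of generating random values.
import json
import random
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="edge-ipc:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

while True:
    reading = {
        "plc_id": "line-3-plc-01",
        "tag": "spindle_temp_c",
        "value": round(random.uniform(60.0, 90.0), 2),
        "ts": time.time(),
    }
    # Each reading becomes one message on the topic; Grafana (via a
    # Kafka or time-series datasource) visualizes the stream downstream.
    producer.send("plc-telemetry", value=reading)
    time.sleep(1)
```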

By the “small” in that mantra, I mean being intimate and humble with the end users. One lesson I have learned over the past years is that the SMEs count on their technology partners to be with them. They have no room for failure, because a costly failure is likely to be devastating to their operations and business. Know the technology you are pitching well, so that the SMEs are confident that you can deliver; do not give them some over-the-top, high-level technology pitch. Look deep into the technology’s integration with their existing technology and operations, and carefully and meticulously craft and curate a well-mapped plan for them. Commit to their journey to ensure their success.

I have often seen technology vendors and resellers leave SMEs high and dry when something falls outside their scope, and this has been painful to watch. That is why I do not see it as a downgrade that I have been working with SMEs more often in the past 3 years, even though I have served the enterprise for more than 25 years. This invaluable lesson is an upgrade for me, helping me serve my SME customers better.

Continue reading

Storage Performance Considerations for AI Data Paths

The hype of Deep Learning (DL), Machine Learning (ML) and Artificial Intelligence (AI) has reached an unprecedented frenzy. Every infrastructure vendor, from servers to networking to storage, has something to say and a part to play in DL/ML/AI. This prompted me to explore this hyped ecosystem from a storage perspective, notably from a storage performance requirement point-of-view.

One question on my mind

There are plenty of questions on my mind. One stands out, and it relates to storage performance requirements.

Reading and learning from one storage technology vendor to another, the framing everyone uses against their competitors seems to be “They are archaic, they are legacy. Our architecture is built from the ground up, modern, NVMe-enabled“. There is more such juxtaposition, but you get the picture – “We are better, no doubt“.

Are the data patterns and behaviours of AI different? How do they affect the storage design as the data moves through the workflow, the data paths and the lifecycle of the AI ecosystem?
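One way I approach that question is to probe the data path the way an AI training job would exercise it: many smaller, randomly ordered reads while feeding the model, and large sequential writes when checkpointing. Below is a rough sketch of such a probe; the file paths and IO sizes are arbitrary placeholders, not a benchmark.

```python
# A rough probe of two very different IO patterns an AI pipeline generates.
# The paths and sizes below are arbitrary placeholders, not a benchmark.
import os
import random
import time

PATH = "/mnt/dataset/testfile.bin"
FILE_SIZE = 1 * 1024**3        # 1 GiB test file, assumed to already exist
SMALL_IO = 128 * 1024          # 128 KiB "training sample" reads
LARGE_IO = 64 * 1024**2        # 64 MiB "checkpoint" writes

# Random small reads, roughly what shuffled training samples look like.
with open(PATH, "rb") as f:
    start = time.time()
    for _ in range(1000):
        f.seek(random.randrange(0, FILE_SIZE - SMALL_IO))
        f.read(SMALL_IO)
    rate = 1000 * SMALL_IO / (time.time() - start) / 1e6
    print(f"random 128 KiB reads: {rate:.1f} MB/s")

# Large sequential writes, roughly what a model checkpoint looks like.
buf = os.urandom(LARGE_IO)
with open("/mnt/scratch/checkpoint.bin", "wb") as f:
    start = time.time()
    for _ in range(8):
        f.write(buf)
    f.flush()
    os.fsync(f.fileno())
    rate = 8 * LARGE_IO / (time.time() - start) / 1e6
    print(f"sequential 64 MiB writes: {rate:.1f} MB/s")
```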

Continue reading

Whither HPC, HPE?

HPE is acquiring Cray Inc. Almost 3 years ago, HPE acquired SGI. Back in 2017, HPE partnered with WekaIO, and it invested big in WekaIO’s latest Series C funding just weeks ago.

Cray, SGI and WekaIO are all strong HPC technology companies. Given the strong uptick in the HPC market, especially commercial HPC, we cannot deny HPE’s ambition to become the top SuperComputing and HPC vendor in the industry. Continue reading

We got to keep more data

Guess which airport has won the most awards in the annual Skytrax list? Guess which airport has won 480 awards since its opening in 1981? Guess how this airport did it?

Data Analytics gives the competitive edge.

Serving and servicing more than 65 million passengers and travellers in 2018, and growing, Changi Airport Singapore sets a very high bar for customer service. And it does it with the help of technology, something they call the Smart (Service Management through Analytics and Resource Transformation) Airport. In an ultra-competitive and cut-throat airline business, the deep integration of customer-centric services and the ultimate traveller’s experience is crucial to the survival and growth of airlines. And it has definitely helped Singapore Airlines to be named the world’s best airline in 2018, its 4th such win.

To achieve that, Changi Airport relies on technology and lots of relevant data for deep insights on how to serve its customers better. The details are well described in this old news article.

Keep More Relevant Data for Greater Insights

When I say more data, I do not mean every single piece of data. Data has to be relevant to be useful.

How do we get more insights? How can we teach systems to learn? How do we develop artificial intelligence systems? By feeding more relevant data into data analytics systems, machine learning and the like.

As such, a simple framework that goes from data ingestion, to data repositories, to outcomes such as artificial intelligence, predictive and recommendation systems, automation and new data insights is not difficult to understand. The diagram below is a high-level overview of what I work with most of the time. Continue reading

Sexy HPC storage is all the rage

HPC is sexy

There is no denying it. HPC is sexy. HPC Storage is just as sexy.

Looking at the latest buzz from the Super Computing Conference 2018, which happened in Dallas 2 weeks ago, the number of storage-related vendors participating was staggering. Panasas, Weka.io, Excelero and BeeGFS are the ones I know of because friends of mine were posting their highlights. Then there are the perennial vendors like IBM, Dell, HPE, NetApp, Huawei, Supermicro, and many more. A quick check on the SC18 website showed that there were 391 exhibitors on the floor.

And this is driven by the relentless demand for higher and higher computing performance, and along with it, the demand for faster and faster storage performance. Commercialization of Artificial Intelligence (AI), Deep Learning (DL) and newer applications and workloads, together with the traditional HPC workloads, is driving these ever-increasing requirements. However, most enterprise storage platforms were not designed to meet the demands of this new generation of applications and workloads, contrary to what many have been led to believe. Why so?

I had a couple of conversations with a few well-known vendors around the topic of HPC storage. Several of the responses thrown back amounted to throwing Flash and NVMe at the high demands of HPC storage performance. In my mind, these responses were too trivial, too irresponsible. So I wanted to write this blog to share my views on HPC storage, and not just about its performance.

The HPC lines are blurring

I picked up this video (below) a few days ago. It is insideHPC’s Rich Brueckner interviewing Dr. Goh Eng Lim, HPE CTO and renowned HPC expert, about the convergence of traditional and commercial HPC applications and workloads.

I liked the conversation in the video because it addressed the 2 different approaches. And I welcomed Dr. Goh’s invitation to the Commercial HPC community to work with the Traditional HPC vendors to help push the envelope towards Exascale SuperComputing.

Continue reading

Own the Data Pipeline

[Preamble: I was a delegate of Storage Field Day 15 from Mar 7-9, 2018. My expenses, travel and accommodation were paid for by GestaltIT, the organizer, and I was not obligated to blog or promote the technologies presented at this event. The content of this blog is of my own opinions and views]

I am a big proponent of Go-to-Market (GTM) solutions. Technology does not stand alone. It must be in an ecosystem, and in each industry, in each segment of each respective industry, every ecosystem is unique. And when we amalgamate data, the storage infrastructure technologies and the data management into the ecosystem, we reap the benefits in that ecosystem.

Data moves in the ecosystem, from system to system, north to south, east to west and vice versa, random, sequential, ad-hoc. Data acquires different statuses, different roles, different relevances in its lifecycle through the ecosystem. From it, we derive the flow, a workflow of data creating a data pipeline. The Data Pipeline concept has been around since the inception of data.

To illustrate my point, I created one for the Oil & Gas – Exploration & Production (EP) upstream some years ago.

 

Continue reading

The leapfrog game in Asia with HPC

Brunei, a country rich in oil and gas, is facing a crisis. Its oil & gas reserves are rapidly running dry and are expected to be depleted within 2 decades. Its deep dependency on oil and gas, once the boon of its economy, is now the bane of its future.

Since 2000, I have been in and out of Brunei and have got involved in several engagements there. It is a wonderful and peaceful country with friendly people, always welcoming visitors with open hearts. The country has prospered for decades on its vast oil riches, but in the past few years, oil prices have slumped. The profits of oil and gas no longer justify the costs of exploration and production.

2 years ago, I started pitching a new economy generator to the IT partners in Brunei, one that I believe will give a country like Brunei the ability to leapfrog its neighbours in South East Asia: start building a High Performance Computing as a Service (HPC-as-a-Service) type of business.

Why HPC? Why do I think HPC will give a developing country like Brunei super powers in the digital economy?

Continue reading

NetApp and IBM gotta take risks

[Preamble: I was a delegate of Storage Field Day 15 from Mar 7-9, 2018. My expenses, travel and accommodation were paid for by GestaltIT, the organizer, and I was not obligated to blog or promote the technologies presented at this event. The content of this blog is of my own opinions and views]

Storage Field Day 15 was full of technology. There were a few avant-garde companies in the line-up which I liked, but unfortunately NetApp and IBM were the 2 companies that came in at the least interesting end of the spectrum.

IBM presented their Spectrum Protect Plus. The data protection space, especially backup, isn’t exactly my forte when it comes to solution architecture, but I know enough to get by. However, as IBM presented, there were so many questions racing through my mind. I kept interrupting myself because almost everything presented wasn’t new to me. “Wait a minute … didn’t Company X already have this?” or “Company Y had this years ago” or “Isn’t this…?”

I was questioning myself to validate my understanding of the backup tech shared by the IBM Spectrum Protect Plus team. And they presented with such passion and gusto that it made me wonder if I was wrong in the first place. Maybe my experience and knowledge in the backup software space weren’t good enough. But then the chatter in the SFD15 Slack channel started pouring in. Comments, unfortunately, were mostly negative, and jibes became jokes. One comment, in particular, nailed it: “This is Veeam 0.2“, and then someone else downgraded it to version 0.1.

Continue reading