4 Digital Workplace Moves after COVID-19

[ Note: This article was published on LinkedIn on March 24, 2020. Here is the link to the original article ]

We live in unprecedented times. Malaysia is on Day 7 of its Movement Control Order (MCO), which is basically a controlled lockdown of movements and activities. In many cases, businesses have ground to a halt, and the landscape has changed forever. The “office” will not always be a physical premises anymore, and “meetings” will not always be physical face-to-face conversations to build relationships and trust.

Trust is vital. A couple of weeks ago, I wrote about 關係 (Guan Xi, the Chinese notion of relationships and connections) and having to re-invent trust in a digital world.


The impact on organizations and businesses is deep and far-reaching, and so, as we move forward and the COVID-19 pandemic dies down, organizations’ Digital Transformation strategies will change as well.

Here are 4 technology areas that I think must take precedence for the Digital Workplace in the Digital Transformation strategy.

Software-Defined Wide Area Network (SD-WAN)

Physical connections have been disrupted. Digital connections are on the rise to supplant “networking” in our physical business world, and the pandemic has just tipped the scale.

Many small and medium businesses (SMBs) rely on home broadband, which may be good enough for some. Medium to large organizations have business broadband. Larger organizations with deeper pockets might already have MPLS (multiprotocol label switching) or leased lines in place. A large portion might have a VPN (virtual private network) set up too.

In time, SD-WAN (software-defined wide area network) services should be considered more seriously. SD-WAN is a more prudent approach that inculcates digital workplace policies: quality of service (QoS) for critical data connections, allocation of network attributes to different data workloads and traffic types, and VPN features, and most offerings come with enhanced security as well.
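SD-WAN appliances apply such QoS policies through their own management planes, but the underlying idea can be sketched with plain Linux traffic control. The snippet below is an illustrative configuration only, not taken from any particular SD-WAN product; the interface name, bandwidth figures and port range are assumptions. It guarantees bandwidth to a “critical” class, such as video conferencing traffic, ahead of bulk traffic.

```shell
# Illustrative QoS shaping with Linux tc; SD-WAN products do this
# (and much more) behind their own policy engines.
# Assumed: interface eth0, 100 Mbit uplink.
tc qdisc add dev eth0 root handle 1: htb default 30
tc class add dev eth0 parent 1: classid 1:1 htb rate 100mbit

# Critical class: guaranteed 40 Mbit, may borrow up to the full link
tc class add dev eth0 parent 1:1 classid 1:10 htb rate 40mbit ceil 100mbit prio 1

# Default bulk class: 20 Mbit guaranteed
tc class add dev eth0 parent 1:1 classid 1:30 htb rate 20mbit ceil 100mbit prio 3

# Steer conferencing media (UDP ports 3478-3481, a common STUN/media range)
# into the critical class
tc filter add dev eth0 parent 1: protocol ip u32 \
    match ip protocol 17 0xff \
    match ip dport 3478 0xfffc flowid 1:10
```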

In addition to performance, security and capacity control, an SD-WAN implementation helps shape employees’ digital workplace practices and, most importantly, redefines the organization’s processes and conditions employees’ mindsets in the Digital Transformation journey.

 

Video Meetings & Conferencing

Video meeting and conferencing solutions have become the poster child of the present pandemic situation. Zoom, Webex, Microsoft Teams, Skype (it is going away), GoToMeeting and more are dominating the new norm of work. Work from home (WFH) has taken on a totally new meaning, especially for employees who have been conditioned to work in an “office”.

I had more than 15 Zoom meetings (on the free version) last week when the Malaysian MCO started, and Zoom has become a critical part of my business now. Thus, it is time to consider paid solutions like Zoom or Webex as part of an organization’s Digital Workplace plans. These will create the right digital culture for the new Digital Workplace.

Personally, I like UberConference because of its on-hold song, sung by their CEO, Alex Cornell. Check out this SoundCloud recording.

File Sharing

Beneath the hallowed halls of video meetings and conferencing, collaboration happens with data shared in files. We have worked with files and folders from our C: drives, NAS home directories and file servers’ shared drives for so long that these processes are almost second nature to us.

In the face of this COVID-19 pandemic, file and information sharing has become cumbersome. The shared drive is no longer on our network, because we are not on the organization’s LAN and intranet anymore. We are working at home, away from the gigabit network that is protected by the organization’s firewall and was once slaved over … err … I mean, supported by our IT admins.

The obvious reaction (since you can’t pass thumb drives around at present) is to resort to Dropbox, OneDrive, Google Drive and others, hoping you won’t max out your free capacity, or to send attachments in emails going back and forth, hoping the mail server will not reject files larger than 10MB.

The fortunate ones have a VPN client on their laptops, but the network traffic backhauls to the central VPN server, overloading it to the max. Pretty soon, network connections are dropped, and the performance of file sharing sucks! Big time!

What if your organization is a bank? Or an oil & gas company, where data protection and data sovereignty dictate the order of the day? The very public enterprise file sync and share (EFSS) services like Dropbox, OneDrive or Google Drive may totally violate the laws of the land, and your organization may be crippled by the inability to do work. After all, files and folders are like the peanut-butter-and-jelly or the nasi lemak-teh tarik (coconut rice and pulled tea, the Malaysian breakfast) combo of work. You can’t live without files and folders.

The thought of having a PRIVATE on-premises EFSS solution in your organization’s Digital Transformation strategy should be moved from the KIV (keep in view) tray to a defined project in the Digital Transformation programme.

At Katana Logic, we work with EasiShare, and it is worth having a serious plan to build your own private file sync and share solution as part of the Digital Workplace.

Security

In such unprecedented times, when our attention is diverted, cybersecurity threats are at their highest. Financial institutions in Malaysia have already been summoned by Bank Negara Malaysia, the central bank, to build the industry’s expectations and confidence through the RMiT (Risk Management in Technology) framework. Unfortunately, conversations with some end users and IT suppliers to Malaysian banks and other financial institutions revealed the typical lackadaisical attitude towards fortifying cyber resiliency practices within these organizations. I would presume the importance of cybersecurity and cyber resiliency practices takes an even further back seat with small and medium businesses.

On a pessimistic note, ransomware and DDoS (distributed denial-of-service) attacks have been on the rise, taking advantage of this pandemic situation. NAS, the network attached storage that serves the organization’s shared files and folders, has become ransomware’s favourite target, as I have written in my blog.

But cybersecurity does not have to be an expensive affair. Applying consistent, periodic password changes, educating employees about phishing emails, and using simple but free port scanners to look at open TCP/UDP ports can be invaluable for small and medium businesses. Subscribing to penetration testing (pentest) services at a regular frequency is immensely helpful as well.
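As a small illustration of that last point, a basic TCP port check needs nothing more than the standard library. The sketch below is a minimal Python port scanner (the host and port list are placeholders); real tools like nmap are far more capable, but this shows how simple a first look at open ports can be. Only scan hosts you own or are authorized to test.

```python
#!/usr/bin/env python3
"""Minimal TCP port scanner sketch for a quick self-audit."""
import socket


def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the TCP handshake succeeds
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports


if __name__ == "__main__":
    # Placeholder target: check a few well-known ports on the local machine
    for p in scan_ports("127.0.0.1", [22, 80, 443, 3389]):
        print(f"Port {p} is open")
```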

In larger organizations, cyber resiliency is more holistic. Putting in layers for defense in depth, the CIA (confidentiality, integrity, availability) triad, and pro-active AAA (authentication, authorization, audit) measures are all part of the cybersecurity framework. These holistic practices must effect change in people and in the processes of how data and things are shared, used, protected and recovered in the whole scheme of things.

Thus organizations must be vigilant and do their due diligence. We must never hesitate to fortify cyber security and cyber resiliency in the Digital Workplace.

Parting thoughts

We are at the most vulnerable stage of our lifetimes, but it is also the best time to understand what is critical to our business. This pandemic is helping to identify the right priorities for work.

Regardless of their size, organizations have to use this COVID-19 situation to assess how it has impacted their business. They must look at what worked and what did not in their digital transformation journey so far, and change the parts that were not effective.

I have looked at 4 areas of technology that I felt could make a difference, and I am sure there are many more areas to address. So, let us take these pessimistic times and turn them into optimistic ones when we are back to normalcy. The Digital Workplace has changed forever, and for the better too.

Continue reading

NetApp double stitching Data Fabric

Is NetApp® Data Fabric bursting at the seams, such that it chose to acquire Talon Storage a few weeks ago?

It was a surprise move, and the first thing that came to my mind was “Who is Talon Storage?” I had seen the name appear in TechTarget and CRN last year but never took the time to go in depth into their technology. I took a quick look at their FAST™ software technology with the video below:

It was reminiscent of the Andrew File System, something I worked on briefly in the 90s, and of WAFS (Wide Area File System), a technology buzzword in the early to mid-2000s led by Tacit Networks, a company I almost joined with a fellow NetApp-ian back then. WAFS DNA appears ingrained in Talon Storage: Talon’s CEO and founder, Shirish Phatak, was the architect of Tacit Networks 20 years ago.

Continue reading

StorageGRID gets gritty

[ Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies presented at the event. The content of this blog is of my own opinions and views ]

NetApp® presented StorageGRID® Webscale (SGWS) at Storage Field Day 19 last month. It was timely, when the general-purpose object storage market, in my humble opinion, was getting disillusioned and almost about to deprive itself of the value it was supposed to deliver.

“Cheap and deep” and “Race to Zero” were some of the less storied calls I have come across when discussing object storage, and they were really de-valuing the merits of object storage as vendors touted the superficial glory of being in the IDC MarketScape for Object-Based Storage 2019.

Almost every single conversation I had in the past 3 years was either explaining what object storage is, or fielding “That is cheap storage, right?”

Continue reading

Paradigm shift of Dev to Storage Ops

[ Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies presented at the event. The content of this blog is of my own opinions and views ]

A funny photo (below) came up on my Facebook feed a couple of weeks back. In an honest way, it depicted how a developer would think (or not think) about the storage infrastructure designs and models for their applications and workloads. It also reminded me of how DBAs used to diss storage engineers: “I don’t care about storage, as long as it is RAID 10”. That was aeons ago 😉

The world of developers and the world of infrastructure people are vastly different. Since the birth of cloud computing, both worlds have collided, and programmable infrastructure-as-code (IaC) has become part and parcel of cloud native applications. Of course, there is no denying that there is friction.

Welcome to DevOps!

The Kubernetes factor

Containerized applications are quickly defining the cloud native applications landscape. The container orchestration machinery has one dominant engine – Kubernetes.

In the world of software development and delivery, DevOps has taken a liking to containers. Containers make it easier to host and manage the life-cycle of web applications inside a portable environment. They package up application code and other dependencies into building blocks to deliver consistency, efficiency, and productivity. To scale to multi-application, multi-cloud environments with thousands and even tens of thousands of microservices in containers, the Kubernetes factor comes into play. Kubernetes handles tasks like auto-scaling, rolling deployments, compute resources, volume storage and much, much more, and it is designed to run on bare metal, in the data center, in the public cloud or even in a hybrid cloud.
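As a small illustration, the rolling-deployment, compute-resource and volume knobs mentioned above surface directly in a Kubernetes Deployment manifest. The sketch below uses placeholder names and a placeholder image; it is not tied to any product discussed here.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app              # placeholder name
spec:
  replicas: 3                # Kubernetes keeps 3 pods running
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1      # roll out new versions one pod at a time
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
      - name: web
        image: example/web:1.0   # placeholder image
        resources:
          requests:
            cpu: "250m"          # compute resource management
            memory: "128Mi"
        volumeMounts:
        - name: data
          mountPath: /var/data
      volumes:
      - name: data
        emptyDir: {}             # volume storage (ephemeral here)
```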

Continue reading

DellEMC Project Nautilus Re-imagine Storage for Streams

[ Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies presented at this event. The content of this blog is of my own opinions and views ]

Cloud computing will have challenges processing data at the outer reach of its tentacles. Edge computing, as it melds with the Internet of Things (IoT), needs a different approach to data processing and data storage. Data generated at source has to be processed at source, to respond to the event or events which have happened. Cloud computing, even with 5G networks, has latency that cannot match how an autonomous vehicle reacts to pedestrians on the road at speed, how a sprinkler system is activated in a fire, or how a fraud detection system signals money laundering activities as they occur.

Furthermore, not all sensors, devices, and IoT end-points are connected to the cloud at all times. To understand this new way of data processing and data storage, have a look at this video by Jay Kreps, CEO of Confluent, the company behind Kafka®, for this new perspective.

Data is continuously and infinitely generated at source, and this data has to be compiled, controlled and consolidated with nanosecond precision. At Storage Field Day 19, an interesting open source project, Pravega, was introduced to the delegates by DellEMC. Pravega is an open source storage framework for streaming data and is part of Project Nautilus.

Rise of streaming time series data

Processing data at source has a lot of advantages and this has popularized Time Series analytics. Many time series and streams-based databases such as InfluxDB, TimescaleDB, OpenTSDB have sprouted over the years, along with open source projects such as Apache Kafka®, Apache Flink and Apache Druid.

The data generated at source (end-points, sensors, devices) is serialized, timestamped (as each event occurs), continuous and infinite. These are the properties of a time series data stream, and to make sense of the streaming data, newer data formats such as Avro, Parquet and ORC pepper the landscape along with the more mature JSON and XML, each with its own strengths and weaknesses.
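A minimal sketch of such a stream record, using JSON from the Python standard library (the field names and sensor IDs are illustrative; Avro, Parquet and ORC add schemas and columnar layouts on top of the same idea):

```python
import json
import time


def make_event(sensor_id, value, ts=None):
    """Build a timestamped event record as generated at source."""
    return {
        "sensor_id": sensor_id,
        "ts_ns": ts if ts is not None else time.time_ns(),  # nanosecond timestamp
        "value": value,
    }


def serialize(event):
    """Serialize one event to a compact JSON line, a simple wire format for a stream."""
    return json.dumps(event, separators=(",", ":"))


# A stream is just an unbounded sequence of such serialized records
stream = [serialize(make_event("temp-01", 23.4 + i)) for i in range(3)]
for line in stream:
    print(line)
```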

You can learn more about these data formats in the 2 links below:

DIY is difficult

Many time series projects started as DIY projects in many organizations. And many of them are still DIY projects in production systems as well. They depend on tribal knowledge, and these databases are tied to unmanaged storage which is not congruent with the properties of streaming data.

At the storage end, the technologies today still rely on the SAN and NAS protocols and, in recent years, S3 with object storage. Block, file and object storage introduce layers of abstraction which may not be a good fit for streaming data.

Continue reading

Rebooting Infrascale

[ Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies to be presented at this event. The content of this blog is of my own opinions and views ]

Infrascale™ was relatively unknown to the Storage Field Day 19 delegates when the company presented a few weeks ago in San Jose. Between 2015 and 2017, it received several awards and accolades, including a place in the Leaders quadrant of the 2017 Gartner Magic Quadrant for DR-as-a-Service.

I have known of Infrascale since 2016, when the BC (business continuity) and DR (disaster recovery) landscape was taking off, gravitating towards the cloud as a secondary platform for recovery.

Continue reading

Komprise is a Winner

[Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies to be presented at this event. The content of this blog is of my own opinions and views]

I, for one, have perhaps seen far too many “file lifecycle and data management” software solutions that involve tiering, hierarchical storage management, ILM or whatever they are called these days. If I did a count, I would have managed or implemented at least 5 to 6 products, including a home-grown one.

It is a very crowded market, and I have seen many products come and go. So when the opportunity to have a session with Komprise came at Storage Field Day 19, I did not carry a lot of enthusiasm.

Continue reading

Tiger Bridge extending NTFS to the cloud

[Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies to be presented at this event. The content of this blog is of my own opinions and views]

The NTFS file system has been around for more than 3 decades. It has been the most important piece of the Microsoft Windows universe, although Microsoft has been replacing it with ReFS (Resilient File System) since Windows Server 2012. Despite best efforts from Microsoft, issues with ReFS remain, and thus NTFS is still the most reliable, go-to file system in Windows.

First reaction to Tiger Technology

When Tiger Technology was first announced as a sponsor of Storage Field Day 19, I was excited about a company with such a cool name. Soon after, I realized that I had encountered the name before, in the media and entertainment space.


Continue reading

Hadoop is truly dead – LOTR version

[Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies to be presented at this event. The content of this blog is of my own opinions and views]

This blog was not planned; it was not in my plans to write it. But a string of events happened in the Storage Field Day 19 week, and now I have the fodder to share my thoughts. Hadoop is indeed dead.

Warning: There are Lord of the Rings references in this blog. You might want to do some research. 😉

Storage metrics never happened

The fellowship of Arjan Timmerman, Keiran Shelden, Brian Gold (Pure Storage) and myself started at the office of Pure Storage in downtown Mountain View, much like Frodo Baggins, Samwise Gamgee, Peregrin Took and Meriadoc Brandybuck forging their journey vows at Rivendell. The podcast was supposed to be on the topic of storage metrics but unanimously swung to Hadoop under the stewardship of Mr. Stephen Foskett, our host at Tech Field Day. I saw Stephen as Elrond Half-elven, the Lord of Rivendell, moderating the podcast as he would have the plans to destroy the One Ring in Mount Doom.

So there we were talking about Hadoop, or maybe Sauron, or both.

The photo of the Oliphaunt below seemed apt to describe the industry attacks on Hadoop.

Continue reading

AI needs data we can trust

[ Note: This article was published on LinkedIn on Jan 21st, 2020. Here is the link to the original article ]

In 2020, the intensity on the topic of Artificial Intelligence will further escalate.

One piece of news which came out last week terrified me. The Sarawak courts want to apply Artificial Intelligence to mete out judgment and punishment, perhaps on a small scale.

Continue reading