The Falcon to soar again

One of the historical feats that has long mesmerized me is the 14-year journey China’s imperial treasures took to escape the Japanese invasion in the early 1930s, sandwiched between rebellions and civil wars in China. More than 20,000 pieces of the imperial treasures made a perilous journey to the west and back again. Split across 3 routes over those 14 years, not a single piece of treasure was broken or lost. All in the name of preservation.

Today, those 20,000-plus pieces live in perpetuity in 2 palaces – the Beijing Palace Museum in China and the National Palace Museum Taipei in Taiwan.

Digital data preservation

Digital data preservation is at the other end of the data lifecycle spectrum. More often than not, it is not the part that many pay attention to. In the past 2 decades, digital data has grown so much that it is now paramount to keep some of it forever. Mind you, this is not the data hoarding kind, but preserving the knowledge and wisdom held in the digital content of the data.

[ Note: If you are interested in knowing more about Data -> Information -> Knowledge -> Wisdom, check out my 2015 article on LinkedIn ]

SNIA (Storage Networking Industry Association) conducted 2 surveys – one in 2007 and another in 2017 – called the 100 Year Archive, and found that the requirement for preserving digital data grew manifold over those 10 years. In the end, the goal is to ensure that the perpetual digital contents are

  • Accessible
  • Undamaged
  • Usable

All at an affordable cost. Therefore, SNIA has the vision that the digital content must transcend the storage medium, the storage system and the technology that holds it.

The Falcon reemerges

A few weeks ago, I had the privilege to speak with Falconstor® Software’s David Morris (VP of Global Product Strategy & Marketing) and Mark Delsman (CTO). It was my first engagement with Falconstor® in almost 9 years! I wrote a piece about Falconstor® on my blog in 2011.


Reap at low tide

[ Note: This article was published on Linkedin more than 6 months ago. Here is the original link to the article ]

[ Update (Apr 13 2020): Amid the COVID-19 pandemic and restricted movement globally, we can turn our pessimism into opportunism ]

Nature has a way of teaching us. What works and what doesn’t are often hidden in plain sight, but we humans are mostly too occupied to notice the things that work.

Why are they not spending?

This news appeared in my LinkedIn feed. It read “Malaysian Banks Don’t Spend Enough on Tech”. It irked me immensely because in a soft economy climate (the low tide), our Malaysian financial institutions should be spending more on technology (reaping the opportunity) to get ahead.

Why are the storks and the egrets in my page photo above waiting and wading in the knee-deep waters? Because at low tide, when the waves ebb, food is exposed to them abundantly. They scurry for shrimps, small crabs, cockles, mussels and more. This is nature’s way.

From the report, the technology spending average among the Malaysian banks is pathetic.


The negative domino effect on SMEs

When the banks are not spending on technology, the other industries, especially the SMEs (small and medium enterprises), follow suit. The “penny pinching” and “purse-string tightening” effect permeates across industries, slowly but surely sending tech spending into a downward spiral.

From a macro-economic point of view, spending slows down. Buying less means lower demand and, effectively, lower supply, and it rolls on. The law of supply and demand just got dumped into an abyss.

A great opportunity for those who see it

When I was an engineer at Sun Microsystems more than 2 decades ago, I read a comment delivered by one of the executives. It said “When times are bad, those who know will get the best parts”. I took his comment to heart because what he said held true, even until today.

This, when the country is experiencing an economic downturn, is the best time. While competitors are holding back and may be reeling from the negative effects of the economy, the banks are in the best position to grab the best deals. This is the time to gain market share, when the competition is holding back for fear that the economy will soften further.

Furthermore, with low interest rates across the board, there is no better time than the present to step up tech spending. Banks should know this very well, yet I am perplexed that they hold back.

That is why the Malaysian banks must kick-start their tech spending campaign now. The SMEs will follow, countering the downturn with demand for the best “parts”. The supply “factories” will be fired up again, leading to positive growth in the economy.

Bank Negara RMiT is that one opportunity

One thing which has been looming is Bank Negara Malaysia’s (the central bank’s) RMiT (Risk Management in Technology) framework. A new version was released in July 2019, and to me as an outsider, it is a great opportunity to grab the best parts. Some of these standards will come into effect in January 2020.

Bank Negara is strongly encouraging banks to improve the security and the confidence of the country’s financial industry, and the RMiT framework is really a prod to increase tech spending. Unfortunately, in some of my business interactions with a few of the banks, foot-dragging is prevalent.

Nature’s lesson

The best time to have your best pick is at low tide. This is nature’s lesson for us. What are we waiting for?

Iconik Content Management Solutions with FreeNAS – Part 2

[ Note: This is still experimental and should not be taken as production material. I took a couple of days over the weekend to “muck” around with the new iconik plug-in in FreeNAS™ to prepare it as a possible future solution. ]

This part is the continuation of Part 1 posted earlier.

iconik partnered with iXsystems™ almost a year ago. iconik is a cloud-based media content management platform. Its storage repository has many integrations with public cloud storage services such as Google Cloud, Wasabi® Cloud and more. The on-premises storage integration is made through the iconik storage gateway, which presents itself to FreeNAS™ and TrueNAS® via plug-ins.

For a limited time, you get free access to iconik via this link.

iconik  – The Application setup

[ Note: A lot of the implementation details come from this iXsystems™ documentation by Joe Dutka. This is an updated version for the latest 11.3 U1 release ]

iconik is feature-rich, and navigating it to set up the storage gateway can be daunting. Fortunately, the iXsystems™ documentation was extremely helpful. It is also helpful to consider this as a 2-step approach so that you won’t get overwhelmed by what is happening.

  • Set up the Application section
    • Get Application ID
    • Get Authorization Token
  • Set up the Storage section
    • Get Storage ID

The 3 credentials (Application ID, Authorization Token, Storage ID) are required to set up the iconik Storage Gateway at the FreeNAS™ iconik plug-in setup.
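As a sketch, the hand-off amounts to collecting three values from the iconik console and pasting them into the plug-in form (the key names below are illustrative, not the plug-in’s actual field labels):

```python
# Illustrative bundle of the 3 credentials the FreeNAS iconik plug-in
# asks for; the key names here are made up for clarity.
gateway_credentials = {
    "application_id": "<Application ID from the Application section>",
    "auth_token": "<Authorization Token from the Application section>",
    "storage_id": "<Storage ID from the Storage section>",
}

# The storage gateway cannot activate until every placeholder is replaced.
missing = [k for k, v in gateway_credentials.items() if v.startswith("<")]
print(f"{len(missing)} credential(s) still to fill in")  # → 3 credential(s) still to fill in
```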


Iconik Content Management Solutions with FreeNAS – Part 1

[ Note: This is still experimental and should not be taken as production material. I took a couple of days over the weekend to “muck” around with the new iconik plug-in in FreeNAS™ to prepare it as a possible future solution. ]

The COVID-19 situation goes on unabated. A couple of my customers asked about working from home and accessing their content files; coincidentally, both are animation studios. Meanwhile, another opportunity came up asking about a content management solution that would work with the FreeNAS™ storage system we were proposing. Over the weekend, I searched for a solution combining content management and cloud access that worked with both FreeNAS™ and TrueNAS®, and I was glad to find the iconik and TrueNAS® partnership.

In this blog (and part 2 later), I document the key steps to set up the iconik plug-in with FreeNAS™. I am using FreeNAS™ 11.3 U1.

Dataset 777

A ZFS dataset is assigned to be the storage repository for the “Storage Target” in iconik. Since iconik has a different IAM (identity access management) from the user/group permissions in FreeNAS™, we have to give the ZFS dataset Read/Write access to all. That is the 777 permission in Unix speak. Note that there is a new ACL manager in version 11.3, and the permissions/access rights screenshot is shown here.

Take note that this part is important. We have to assign @everyone Full Control because the credentials at iconik are tied to the permissions we set for @everyone. Missing this part will deny the iconik storage gateway scanner access to this folder, and the status will remain “Inactive”. We will discuss this part more in Part 2.
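In Unix terms, that requirement is simply mode 777 on the dataset’s mountpoint. A minimal sketch, using a temporary directory as a stand-in for the real mountpoint (a hypothetical /mnt/pool/iconik on an actual system):

```python
import os
import stat
import tempfile

# Stand-in for the ZFS dataset mountpoint, e.g. /mnt/pool/iconik
dataset = tempfile.mkdtemp()

# 0o777 = read/write/execute for owner, group and everyone else,
# matching Full Control for @everyone in the 11.3 ACL manager.
os.chmod(dataset, 0o777)

mode = stat.S_IMODE(os.stat(dataset).st_mode)
print(oct(mode))  # → 0o777
```

On the real system this is the equivalent of `chmod 777` on the dataset path, or granting @everyone Full Control through the ACL manager screen.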


4 Digital Workplace Moves after COVID-19

[ Note: This article was published on LinkedIn on March 24, 2020. Here is the link to the original article ]

We live in unprecedented times. Malaysia has been in Movement Control Order (MCO) Day 7, which is basically a controlled lockdown of movements and activities. In many cases, businesses have ground to a halt, and the landscape has changed forever. The “office” will not always be a premises anymore, and the “meetings” will not be a physical face-to-face conversation to build relationships and trust.

Trust is vital. A couple of weeks ago, I wrote about 關係 (Guan Xi) and having to re-invent Trust in a Digital World.


The impact on organizations and businesses is deep and powerful, and so, as we move forward after the COVID-19 pandemic dies down, organizations’ Digital Transformation plans will change as well.

Here are 4 technology areas which I think must take precedence for the Digital Workplace in the Digital Transformation strategy.

Software-Defined Wide Area Network (SD-WAN)

Physical connections have been disrupted. Digital connections are on the rise to supplant “networking” in our physical business world, and the pandemic situation has just tipped the scale.

Many small and medium businesses (SMBs) rely on home broadband, which may be good enough for some. Medium to large organizations have business broadband. Larger organizations with deeper pockets might already have MPLS (multiprotocol label switching) or leased lines in place. A large portion might have a VPN (virtual private network) set up too.

In time, SD-WAN (software-defined wide area network) services should be considered more seriously. SD-WAN is a more prudent approach that inculcates digital workplace policies such as quality of service (QoS) for critical data connections and the allocation of network attributes to different data workloads and network traffic, along with VPN features; most offerings come with enhanced security as well.

In addition to performance, security and capacity control, an SD-WAN implementation helps shape employees’ digital workplace practices and, most importantly, redefines the organization’s processes and conditions employees’ mindsets in the Digital Transformation journey.

 

Video Meetings & Conferencing

Video Meetings and Conferencing solutions have become the poster child of the present pandemic situation. Zoom, Webex, Microsoft Teams, Skype (it is going away), GoToMeeting and more are dominating the new norm of work. Work from home (WFH) has a totally new meaning now, especially for employees who have been conditioned to work in an “office”.

I had more than 15 Zoom meetings (on the free version) last week when the Malaysian MCO started, and Zoom has become a critical part of my business now. Thus, it is time to consider paid solutions like Zoom or Webex as part of an organization’s Digital Workplace plans. These will create the right digital culture for the new Digital Workplace.

Personally I like Uberconference because of their on-hold song. It is sung by their CEO, Alex Cornell. Check out this SoundCloud recording.

File Sharing

Beneath the hallowed halls of video meetings and conferencing, collaboration happens with data shared in files. We have lived with files and folders, from our C: drives to NAS home directories to file servers’ shared drives, for so long that these processes are almost second nature to us.

In the face of this COVID-19 pandemic, file and information sharing has become cumbersome. The shared drive is no longer on our network, because we are not on the organization’s LAN and intranet anymore. We are working at home, away from the gigabit network protected by the organization’s firewall, and once slaved … err … I mean supported by our IT admins.

The obvious reaction (since you can’t pass thumb drives around anymore) is to resort to Dropbox, OneDrive, Google Drive and others, hoping you won’t max out your free capacity. Or attachments in emails going back and forth, hoping the mail server will not reject files larger than 10MB.

The fortunate ones have a VPN client on their laptops, but the network traffic backhauls to the central VPN server, overloading it to the max. Pretty soon, network connections are dropped, and the performance of file sharing sucks! Big time!

What if your organization is a bank? Or an Oil & Gas company where data protection and data sovereignty dictate the order of the day? The very public enterprise file sync and share (EFSS) services like Dropbox, OneDrive or Google Drive totally violate the laws of the land, and your organization may be crippled by the inability to do work. After all, files and folders are the peanut-butter-and-jelly or the nasi lemak-teh tarik (coconut rice & pulled tea Malaysian breakfast) combo of work. You can’t live without files and folders.

The thought of having a PRIVATE on-premises EFSS solution in your organization’s Digital Transformation strategy should be moved from the KIV (keep in view) tray to a defined project in the Digital Transformation programme.

At Katana Logic, we work with Easishare, and it is worth seriously planning to build your own private file share and sync solution as part of the Digital Workplace.

Security

In such unprecedented times, where our attention is diverted, cybersecurity threats are at their highest. Financial institutions in Malaysia have already been summoned by Bank Negara, the central bank, to build the industry’s expectations and confidence through the RMiT framework. Unfortunately, conversations with some end users and IT suppliers to Malaysian banks and other financial institutions revealed the typical lackadaisical attitude toward fortifying cyber resiliency practices within these organizations. I would presume the importance of cybersecurity and cyber resiliency practices takes an even further back seat with small and medium businesses.

On a pessimistic note, ransomware and DDoS (distributed denial-of-service) attacks have been on the rise, taking advantage of this pandemic situation. NAS, the network attached storage that serves the organization’s shared files and folders, has become ransomware’s favourite target, as I have written in my blog.

But cybersecurity does not have to be an expensive affair. Enforcing consistent, periodic password changes, educating employees about phishing emails, and using simple but free port scanners to look at open TCP/UDP ports can be invaluable for small and medium businesses. Subscribing to penetration testing (pentest) services at a regular frequency is immensely helpful as well.
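To illustrate the free port scanner idea, a dozen lines of Python with the standard socket library will report which TCP ports on a host accept connections (a sketch only; scan just the machines you own or are authorized to test):

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Example: probe a few well-known service ports on your own machine
print(scan_ports("127.0.0.1", [22, 80, 443, 3389]))
```

Dedicated tools like nmap go much further, but even this level of visibility shows which services are exposed.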

In larger organizations, cyber resiliency is more holistic. Layered defense in depth, the CIA (confidentiality, integrity, availability) triad, and AAA (authentication, authorization, audit) pro-active measures are all part of the cybersecurity framework. These holistic practices must effect change in people and in the processes of how data and things are shared, used, protected and recovered in the whole scheme of things.

Thus organizations must be vigilant and do their due diligence. We must not hesitate to fortify cyber security and cyber resiliency in the Digital Workplace.

Parting thoughts

We are at the most vulnerable stage of our lifetimes, but it is also the best time to understand what is critical to our business. This pandemic is helping to identify the right priorities for Work.

At any level, regardless, organizations have to use this COVID-19 situation to assess how it has impacted their business. They must look at what worked and what did not in their digital transformation journey so far, and change the parts that were not effective.

I have looked at the 4 areas of technology that I felt could make a difference, and I am sure there are many more areas to address. So, let us take these pessimistic times and turn them into optimistic ones when we are back to normalcy. The Digital Workplace has changed forever, and for the better too.


NetApp double stitching Data Fabric

Is NetApp® Data Fabric breaking at the seams, such that it chose to acquire Talon Storage a few weeks ago?

It was a surprise move, and the first thing that came to my mind was “Who is Talon Storage?” I had seen that name appear in TechTarget and CRN last year but never took the time to go in depth into their technology. I took a quick look at their FAST™ software technology with the video below:

It was reminiscent of the Andrew File System, something I worked on briefly in the 90s, and of WAFS (Wide Area File System), a technology buzzword of the early-to-mid 2000s led by Tacit Networks, a company I almost joined with a fellow NetApp-ian back then. WAFS DNA appears ingrained in Talon Storage; it turns out Talon’s CEO and Founder, Shirish Phatak, was the architect of Tacit Networks 20 years ago.


StorageGRID gets gritty

[ Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies presented at the event. The content of this blog is of my own opinions and views ]

NetApp® presented StorageGRID® Webscale (SGWS) at Storage Field Day 19 last month. It was timely, as the general purpose object storage market, in my humble opinion, was getting disillusioned and almost depriving itself of the value it was supposed to deliver.

“Cheap and deep” and “Race to Zero” were some of the less storied calls I have come across when discussing object storage, and they were really de-valuing the merits of object storage as vendors touted their superficial glory of being in the IDC Marketscape for Object-based Storage 2019.

Almost every single conversation I had in the past 3 years was either explaining what object storage is, or fielding “That is cheap storage, right?”

Continue reading

Paradigm shift of Dev to Storage Ops

[ Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies presented at the event. The content of this blog is of my own opinions and views ]

A funny photo (below) came up on my Facebook feed a couple of weeks back. In an honest way, it depicted how a developer would think (or the lack of thinking) about the storage infrastructure designs and models for the applications and workloads. This also reminded me of how DBAs used to diss storage engineers. “I don’t care about storage, as long as it is RAID 10“. That was aeons ago 😉

The world of developers and the world of infrastructure people are vastly different. Since cloud computing was birthed, the two worlds have collided, and programmable infrastructure-as-code (IaC) has become part and parcel of cloud native applications. Of course, there is no denying that there is friction.

Welcome to DevOps!

The Kubernetes factor

Containerized applications are quickly defining the cloud native applications landscape. The container orchestration machinery has one dominant engine – Kubernetes.

In the world of software development and delivery, DevOps has taken a liking to containers. Containers make it easier to host and manage the life cycle of web applications inside a portable environment. They package up application code and other dependencies into building blocks to deliver consistency, efficiency and productivity. To scale to multi-application, multi-cloud environments with thousands and even tens of thousands of microservices in containers, the Kubernetes factor comes into play. Kubernetes handles tasks like auto-scaling, rolling deployments, compute resources, volume storage and much, much more, and it is designed to run on bare metal, in the data center, in the public cloud or even in a hybrid cloud.


DellEMC Project Nautilus Re-imagine Storage for Streams

[ Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies presented at this event. The content of this blog is of my own opinions and views ]

Cloud computing will have challenges processing data at the outer reach of its tentacles. Edge Computing, as it melds with the Internet of Things (IoT), needs a different approach to data processing and data storage. Data generated at source has to be processed at source, to respond to the event or events which have happened. Cloud Computing, even with 5G networks, has latency that is not sufficient for how an autonomous vehicle reacts to pedestrians on the road at speed, how a sprinkler system is activated in a fire, or even how a fraud detection system flags money laundering activities as they occur.

Furthermore, not all sensors, devices and IoT end-points are connected to the cloud at all times. To understand this new way of data processing and data storage, have a look at this video by Jay Kreps, CEO of Confluent (the company behind Kafka®), for this new perspective.

Data is continuously and infinitely generated at source, and this data has to be compiled, controlled and consolidated with nanosecond precision. At Storage Field Day 19, an interesting open source project, Pravega, was introduced to the delegates by DellEMC. Pravega is an open source storage framework for streaming data and is part of Project Nautilus.

Rise of streaming time series data

Processing data at source has a lot of advantages and this has popularized Time Series analytics. Many time series and streams-based databases such as InfluxDB, TimescaleDB, OpenTSDB have sprouted over the years, along with open source projects such as Apache Kafka®, Apache Flink and Apache Druid.

The data generated at source (end-points, sensors, devices) is serialized, timestamped (as the event occurs), continuous and infinite. These are the properties of a time series data stream. To make sense of the streaming data, new data formats such as Avro, Parquet and ORC pepper the landscape, along with the more mature JSON and XML, each with its own strengths and weaknesses.
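Those properties, serialized, timestamped at the event, continuous and infinite, can be sketched with a simple generator (the device name and reading below are made up for illustration):

```python
import itertools
import json
import time

def event_stream(sensor_id):
    """Infinite generator of serialized, timestamped readings,
    mimicking a time series data stream from an edge device."""
    for seq in itertools.count():
        yield json.dumps({
            "sensor": sensor_id,
            "seq": seq,               # strict ordering (serialized)
            "ts": time.time(),        # timestamped as the event occurs
            "value": 20.0 + seq % 5,  # placeholder sensor reading
        })

stream = event_stream("edge-device-01")  # hypothetical device name
first_three = [json.loads(next(stream)) for _ in range(3)]
print([e["seq"] for e in first_three])   # → [0, 1, 2]
```

A stream store has to ingest records of exactly this shape without end, which is why the block and file abstractions discussed below struggle to fit.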

You can learn more about these data formats in the 2 links below:

DIY is difficult

Many time series projects started as DIY projects in many organizations, and many of them remain DIY projects in production systems as well. They depend on tribal knowledge, and these databases are tied to unmanaged storage which is not congruent with the properties of streaming data.

At the storage end, the technologies today still rely on the SAN and NAS protocols and, in recent years, S3 with object storage. Block, file and object storage introduce layers of abstraction which may not be a good fit for streaming data.


Rebooting Infrascale

[ Disclosure: I was invited by GestaltIT as a delegate to their Storage Field Day 19 event from Jan 22-24, 2020 in the Silicon Valley USA. My expenses, travel, accommodation and conference fees were covered by GestaltIT, the organizer and I was not obligated to blog or promote the vendors’ technologies to be presented at this event. The content of this blog is of my own opinions and views ]

Infrascale™ was relatively unknown to the Storage Field Day 19 delegates when they presented a few weeks ago in San Jose. Between 2015 and 2017, they received several awards and accolades, including a place in the Leaders quadrant of the 2017 Gartner Magic Quadrant for DR-as-a-Service.

I have known of Infrascale since 2016, when the BC and DR landscape was taking off and gravitating towards the cloud as a secondary platform for recovery.
