
The Mehta-Data Podcast | Episode 3: Are Governance Issues A Concern In Your 2019 Budget Preparations?

August/September 2018 Edition

As corporations prepare their 2019 annual budgets, many enterprises are weighing how to fund their digital enterprise needs and how those decisions may need to adhere to corporate and industry governance.  Data Dynamics CEO Piyush Mehta, the “Dean of Data,” addressed the specifics of budgeting and governance issues in his recent Mehta-Data© podcast.

Why is governance an issue in data management and migration?

Data governance is looked at from many angles because of the sheer amount of data and the importance of that data in every vertical marketplace.  Governance really encompasses four key aspects or attributes.

The first is availability, and the access to such data for usability: how many people can use the data, and how it can accurately be made available.

The second is content security.  Enterprises need to ensure that their data is correct in terms of its content, and security ensures that those who need to access the data can do so…and only those who should access the data will have access.

This ties to the third area of unstructured data. The largest portion of data growth comes from unstructured data. Most organizations have 60-plus percent of their data today in an unstructured format.

And the fourth area is, by far, the most rapidly growing aspect of the data infrastructure: regulatory issues.  Changes in regulation challenge every enterprise to monitor and maintain compliance.

How do regulatory issues specifically affect the governance of data management and migration?

If governance is looked at primarily from a regulatory standpoint, I believe that is the wrong approach.  GDPR in Europe and certainly Sarbanes-Oxley in the US put emphasis on how data is managed and accessed.

An organization really needs to look at data governance from an internal-efficiency standpoint rather than a regulatory one, driving its capability to manage the data lifecycle. At the end of the day, governance is in place to drive effectiveness and efficiency in the control and management of the data, rather than being there for administration.

Future governance security issues are important, but the challenge is who is accessing data and how.

Does this relate purely to having the proper rights and privileges to access?

As you know, external sources look to pry into data, both personal and business-confidential information.  The major challenge is whether different departments should or should not have access to proprietary and/or personnel information within a company, and that is a key challenge for any business.

This is further complicated by mergers and acquisitions, where you have different access points for different records. Making sure those records are not transferred haphazardly, so that management and governance access controls can still be enforced as everyone moves to the hybrid cloud, is imperative in every industry.

How does governance relate to such a hybrid cloud scenario?

At the end of the day, you have to make sure that you have a team of reliable data stewards. These are the people who are going to be responsible for the overall management of governance, and this could and should include both the management and contributing lines of business, as well as the information technology teams that can drive the technology aspects.

When you take these data stewards and apply the necessary compliance, it doesn’t really matter whether the data sits locally, in the public cloud, or in a hybrid model, because you are now applying the same standards along with the technology and processes required to manage that data.

How are organizations tackling this governance issue in the big picture you just detailed?

As I mentioned, most people look at governance in the wrong way and see it as regulatory rather than driven by internal efficiency. When people run a disaster recovery test as a means of governance, it is done to make sure the test passes and to avoid any kind of risk relative to compliance issues from an external governance body.

When they do this, they actually plan out the entire disaster recovery (DR) process proactively and ahead of time, rather than just flipping a switch. They just want to determine governance compliance.

Yet the data can be moved to a secondary site while maintaining the same access as before. This should be embraced because it establishes better management rather than just rigorous planning, not because it meets regulations.

At the end of the day, when a disaster happens, whether instantaneously or over a period of time, your organization must be ready.  Make sure that you are embracing governance from a productivity standpoint rather than as a checkbox that needs to be ticked for regulation’s sake.

How can such solutions for a data management organization help address these challenges relative to scale, both geographically and in terms of sheer data volume?

There are three key pillars for data stewards. The first is compliance: creating the standards, which is the process element.

The second is the technology element: having software solutions that can scale at a global level to deploy and manage the compliance required around these established standards across tens of billions of files, if not hundreds of petabytes of data.

The third is the ability of your software to deliver that kind of scalability, which is important.  This ties in with the ability to understand and address the related security aspects.

That availability and accuracy, along with the obvious usability, provides a good technology platform for addressing governance issues across fiscal and organizational calendar years. And this is where the enterprise budget comes into play.

How can budgeting affect becoming a data enterprise?

The challenge with budgets is a continuing game that most large enterprises play.  There is never enough money to address things proactively, but there is always an open checkbook to correct a problem or challenge reactively once it is considered mission-critical.

It is important to budget accurately for the data stewards’ people and time, in terms of establishing and maintaining the proper compliance and standards, and then for spending on the software and technologies required to automate, manage, and leverage artificial intelligence to drive a proactive approach to governance. The reactive approach that most organizations use is when infiltration of data happens most often.

When a problem occurs, whether it is an infrastructure-related or a people-related process challenge, enterprises often spend an unimaginable amount of money without hesitation. So I believe this proactive budgeting plan can avoid most reactive issues.

Based upon what you just described, what should an enterprise be doing right now to prepare for 2019 when it comes time to allocate budget items?

I think it’s important to talk to the right constituents and potential partner experts so that they can put the proper team in place and align that team to make sure standards are set for accomplishing a proactive governance program.  Different software platforms can be researched and reviewed to ensure the proper value and technology are acquired to address both reactive and proactive compliance with standards.

Partnerships should be fully tested and proven out in order to ensure the proper scale can be achieved for a 2019 implementation plan.  Enterprises should also consider testing several aspects with available discretionary funds for the rest of 2018 to best determine next year’s budget and which solutions are acceptable.

How can a company better become a data enterprise without a specific budget allocation?

Small steps can be taken when initially addressing reactive solutions. It’s hard to deploy them at a global level, but you can talk to companies that offer software platforms which might be tried on a subset of the environment.

It is vital to test these and ensure that the solutions meet or exceed your standards and requirements from a compliance or data governance standpoint, so you can approach the problem proactively in a step-by-step fashion rather than reaching a piecemeal conclusion.

Such a holistic approach helps avoid time and budgetary issues down the road.

What if people need additional information about governance and budgeting?

They can find anything online at www.datadynamicsinc.com. We have a ton of information available for them and would be happy to provide further guidance from one of our subject matter experts.

Another quick and easy way is to go to our website at www.datdyn.com, where we have created a survey based on customer responses. We found that a lot of customers ask us typical questions that help them understand where and how they can get to digitalization.

We placed the survey right at the top of the website. Take a look; based on your responses, it can help guide you on the journey and show how best to digitalize from wherever you are in that journey.


If you are interested in moving to the cloud for storage savings and better business analytics, you can get a total cost of ownership (TCO) comparison and a list of recommendations by completing a short survey on the Data Dynamics website, www.datadynamicsinc.com, for a free assessment.

You can also subscribe to our SoundCloud channel and get access to all podcasts!


The 7 Ways Managing Your Own Data Can Give You the Advantage

Have you experienced vendor lock-in? Is there an easier way to manage and store your data? Data management can be a huge factor in productivity and efficiency. Understand the benefits of StorageX and how you can transition to a rich set of capabilities that are the building blocks of Dynamic File Management for the Digital Enterprise.

The 7 Benefits of StorageX

  1. Analytics: Know what you have and where it is with intelligent views. StorageX provides powerful file analytics to tell you what you have so you can turn your data from a risk into a strategic asset.
  2. File Migration: With actionable data, move your files with ease across any technology. Easily merge with another company, and more.
  3. File Replication: Copy files for backup, disaster recovery, DevOps, or performance. Don’t be tied down to one vendor, and know that your company’s assets are in good hands.
  4. Archive Retrieval: Even though it is not needed on a daily basis, organize your archived files for easy access. Keep archive data at your fingertips. Search and retrieve data for audits, discovery, and analytics.
  5. File Archival: Centrally manage data from “live” to “archive” and optimize data placement for compliance needs. Organize your files to make it easy for anyone on your team to find what they need.
  6. Namespace Management: Fully utilize Microsoft’s Distributed File System (DFS) capabilities from one console. Customize the organization of the DFS namespace to fit your company.
  7. Software Developer’s Kit: Integrate dynamic file management into key business applications with RESTful APIs (see the sketch after this list). The developer kit features options to integrate data movement with the cloud for file archival and disaster protection.
  8. Application Modernization API: Your applications stay in control of moving data to the cloud and back. Cloud data done your way. Move files easily across heterogeneous storage resources, placing data where you need it, when you need it.
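
To make the RESTful integration idea in item 7 a little more concrete, here is a minimal sketch of a business application submitting an archive request over HTTP. The endpoint, payload fields, and credentials below are hypothetical placeholders for illustration only, not the actual StorageX API.

    import requests

    # Hypothetical endpoint and credentials -- placeholders for illustration,
    # not the actual StorageX RESTful API.
    API_BASE = "https://file-mgmt.example.com/api/v1"
    API_KEY = "replace-with-your-api-key"

    def request_archive(source_share: str, target_tier: str) -> dict:
        """Submit an archive job for a share to a given storage tier."""
        response = requests.post(
            f"{API_BASE}/archive-jobs",
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"source": source_share, "target": target_tier},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()  # e.g. {"jobId": "1234", "status": "queued"}

    if __name__ == "__main__":
        job = request_archive(r"\\fileserver01\projects", "s3-archive-tier")
        print("Submitted archive job:", job)

The point is simply that data movement becomes something an application can request programmatically, rather than a manual storage task.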


With the facts stated above, it’s evident that Dynamic File Management was created to ease the pain of not being able to access and store files. The benefits of StorageX will lead your company to success and help prepare it for the cloud.

If you want to start your digital transformation today or have more questions, we would love to help. Request a call today and one of our experts will contact you as soon as possible. Or even better, request a demo and see it all in action.


Eliminating The Chaos: Effectively Manage Data

Enterprises are unable to effectively manage data due to technology lock-in, complexity, and risk. What is technology lock-in you might ask? Technology lock-in is the idea that the more a society adopts a certain technology, the more unlikely users are to switch. For example, the continued prevalence of the QWERTY keyboard layout is said to be caused by technological lock-in. 


The Problems With Managing Data Today

Growth continues to impede an enterprise’s ability to manage data. Over the next year, there will be a 59% increase in worldwide data growth, making it nearly impossible for enterprises to join the cloud strategically. Enterprises have also tried to manage data effectively with archaic tools and manual processes, resulting in an IT system that can’t keep up and in not knowing what data they have. As a result, enterprises move data from one system to another instead of building value.

Execute a Dynamic File Management Strategy

What can you do to successfully and effectively manage data in the cloud? Utilize the intelligence on offer and make sure you’re in control with Dynamic File Management for the Digital Enterprise. Analyze your data by knowing what you have and where it is. Have the capability to unlock your files and move them with ease. Control, manage, and synchronize file resources on demand. Lastly, modernize, and always integrate dynamic file management as technology moves forward.

Be in Control & You Can Effectively Manage Data

Curious about how controlling your data will affect your company? Properly managed data has been shown to improve productivity by 10x and reduce costs by 50%. Unlock your data assets, prevent vendor lock-in, and uncover the true value of efficiently managed data with Data Dynamics.

If you want to start your digital transformation today or have more questions, we would love to help. Request a call today and one of our experts will contact you as soon as possible. Or even better, request a demo and see it all in action.

A Dilemma That Will Impact Many Startups in 2018!

The past 6 years have seen a ton of start-ups in the infrastructure management space. There has been a flood of new companies created to address the massive challenge of managing the plumbing and underlying compute and storage that facilitates the growth of the Internet of Things, Public and Private Cloud and Artificial Intelligence. The ease of raising funding from Angels, Venture and Private Equity has provided entrepreneurs an easy source of capital. What a great time to be a start-up…or is it? I equate the existing start-up space to a gambler (in this case an investor) debating whether to bet more chips on another hand, double down on the one that exists, or walk away from the table.

2017 saw almost 4 billion internet users, Gartner predicted 8 billion things would be connected, and AI became more prevalent and started to make an impact in our daily lives. We truly are in an information technology renaissance, with the world becoming virtually smaller, new innovations shaping our daily routines, and the workforce transforming to meet the needs of a digital economy. Every day we see new technologies make front page news, from online shopping to self-driving cars to cryptocurrency to robotics and AI. Each and every one of these next-generation innovations requires core/edge computing capability, tons of storage to keep data that can be mined and utilized, and networks through which the information can flow across the globe. The underlying infrastructure is evolving with new innovations from legacy vendors and start-ups to meet the needs of the market.

This exciting era of technology has led to crowd funding, angels, super angels, venture capitalists for different stages of growth and private equity all pumping money into new ideas and companies. From raising thousands to raising billions, the opportunity to stay private and raise as much capital as required has been the mantra of most start-ups, avoiding the scrutiny of the public markets and all that comes with it. Venture and private equity funds have raised tens, if not hundreds of billions of dollars to invest in the next Amazon or Alibaba! Nobody wants to miss the party and everyone wants ‘in’ on the 10-20-50x return that awaits upon an exit! This is analogous to sitting at a blackjack table where everyone around is winning, so the enthusiasm keeps building and players keep increasing their bets, doubling down as there are no signs of a losing hand. Investors see their other investments or their peers making multi-fold returns via a unicorn exit, and the exuberance continues in stride.

Unfortunately, most of the start-ups never think about profitability and focus solely on customer acquisition, top-line growth or, worse yet, number of users/clicks without any direct correlation to financial metrics. This focus on customer acquisition and top-line growth is an essential component of a company’s growth curve, but at some stage there has to be a path to profitability. Start-ups in today’s world don’t worry about profit as they are more focused on raising the next round of funding, and then the next and the next, and before long the company has raised tens if not hundreds of millions without earning a dollar. What’s amazing to me is that investors continue to do round after round of investment despite knowing that throwing good money after bad doesn’t make sense. The challenge is, once they are ‘in’, they have to keep on investing as they need to show their limited partners (LPs) that the investments they’ve made are continuing to progress forward. Keeping the blackjack analogy in mind, think of the same table that is full of exuberance when a couple of the players lose a hand or two. The gambling mindset is that the loss is a fluke, that it won’t happen to me, or that it definitely won’t happen two times in a row. If the gambler keeps playing, and maybe even increases the stake, a win will yield rewards.

What you will see in 2018 is that a large majority of the start-ups will end up closing or being sold for pennies on the dollar. The reason is not that the technologies are not good; it is that the companies are not profitable, they are not within sight of being profitable, and the investment dollars for new capital are drying up. There are several reasons for investors not being willing or able to invest further. First, the investors have a time horizon that may be coming due. Most funds have a ‘life’ for each fund raised, typically 10 years from inception, so a fund is bound to exit from what it has invested in before the time horizon runs out. Secondly, to go public requires delivering on numbers. The public markets are rewarding companies that meet or exceed forecasts and just as harshly killing those that don’t. Financials do matter, and the public market is clear that you must show profitability, or a path to it, in order to continue to be supported with a strong share price. There are exceptions to this, but even those exceptions face a crazy roller coaster ride in their share price. The other option is a private exit, an M&A to a strategic. The challenge here is that most large companies are extremely smart and have fairly mature M&A processes, not to mention activist investors that are monitoring every major spend. They are not going to pay multiples if they know the company is going to run out of money and is on its last breath. In addition, they will not want to take on a transaction unless it is strategic and can be additive to their earnings, or has a diminutive short-term negative earnings impact. Going back to my gambling analogy…the gambler has a flight to catch and needs to leave the table pretty soon, and he/she must decide what to do: should they bet more chips and double down, take a new hand, or simply walk away? My feeling is that many in 2018 will either take ‘even money’ or take the loss and walk away!

To my fellow entrepreneurs: we are the dealers of each hand, and making the gambler win is in our best interest. Focus on profit, and the adage ‘the house never loses’ will definitely come to fruition. Best of luck in 2018!


The Cloud: Transforming How We Manage Storage Infrastructure

When I first heard of ‘The Cloud,’ I thought it was just marketing jargon used by technology companies to create a false new market.

In reality, The Cloud, in its various forms, is re-defining how we access, utilize, and manage software, hardware, and IT services.

Continue reading The Cloud: Transforming How We Manage Storage Infrastructure


How to Manage Your Orphaned Data

With the exception of structured data, many companies are unaware of which files are present within their Windows shares or NFS exports.  Over time, data has moved from department to department, project to project. It has been created, gone unused, and been left orphaned by users leaving the company or by corporate restructurings.

Managing Orphaned Data with StorageX 8.0

StorageX 8.0 introduces our File Analytics web portal.  The web portal displays a dashboard representing the results of data scans and subsequent analysis.  Each data scan interrogates a specified share, export, or multiple shares and exports.  The scan tags and compiles the file metadata into the file analytics database.  Once the metadata is in the database, we can query the tags and metadata to narrow down the scope of data that is of concern.

File metadata can be used to help a company determine the use, ownership, file type, file size, creation date, access date, last modified date, and many other criteria that can be used to make decisions about where the data should be stored.  For purposes of this discussion, we will focus on ownership—specifically, unowned or orphaned data.
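
As a simple illustration of the underlying idea (not the StorageX implementation itself), the sketch below walks an NFS export, looks up each file’s owning account, and flags files whose owner is unknown or no longer active. The export path and the list of active accounts are assumptions made up for the example.

    from pathlib import Path

    # Hypothetical inputs -- adjust for your own environment.
    EXPORT_ROOT = Path("/exports/projects")            # NFS export to scan
    ACTIVE_ACCOUNTS = {"alice", "bob", "svc_backup"}    # accounts still in the directory

    def find_orphaned_files(root: Path, active_accounts: set):
        """Yield (path, owner) for files whose owning account is unknown or inactive."""
        for path in root.rglob("*"):
            if not path.is_file():
                continue
            try:
                owner = path.owner()   # POSIX owner lookup; raises KeyError if the UID has no account
            except KeyError:
                owner = None           # no matching account at all -> clearly orphaned
            if owner is None or owner not in active_accounts:
                yield path, owner

    if __name__ == "__main__":
        for path, owner in find_orphaned_files(EXPORT_ROOT, ACTIVE_ACCOUNTS):
            print(f"orphaned: {path} (owner={owner})")

A scan like this is only a starting point; the value of a metadata database is that the same ownership, size, and age criteria can be queried repeatedly without re-walking the filesystem.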

Read the rest of the whitepaper. 

Want to learn more about StorageX?

StorageX Analysis Portal Demo

StorageX Petabyte Scale Migration

 


Why is Object Storage Growing in Popularity?

Object Storage is the buzz of the storage industry and for good reason: it encompasses file, block, and object access portals into the same pool of raw object storage.

Compared to traditional file and block storage, object storage offers massive petabyte scalability and built-in availability. It’s a distributed design model that removes the single point of failure of an individual drive. Object storage nodes are combined to enable unlimited capacity and consistent access performance.

At Data Dynamics, we are very excited about the potential for object storage. In the recent release of StorageX 8.0, we unveiled new file-to-object conversion to support S3-compliant object storage. Based on feedback from our customers, file-to-object conversion was the number one requested new feature.
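
At its simplest, file-to-object conversion means a file on a share becomes an object in a bucket. The sketch below shows that basic step with boto3 against an S3-compatible store; it is a generic illustration, not StorageX’s conversion engine, and the endpoint, bucket name, and credentials are placeholders.

    import boto3

    # Placeholder endpoint, bucket, and credentials for an S3-compatible object store.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://object-store.example.com",
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    def file_to_object(local_path: str, bucket: str, key: str) -> None:
        """Upload one file as an object, keeping the original share layout in the key."""
        s3.upload_file(local_path, bucket, key)

    if __name__ == "__main__":
        file_to_object("/exports/projects/report.pdf", "archive-bucket", "projects/report.pdf")

Keeping the original path in the object key is one simple way to preserve the file hierarchy so data remains findable after conversion.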

StorageX is the file management solution of choice for 6 of the 12 largest banks in the world.  Large banks that actively manage petabytes of data rely on object storage for its massive scalability, availability and economies of scale.  One of our customers, whose name we cannot reveal, was kind enough to share how they recently completed a 10 PB filesystem refresh using StorageX.

When measured across all our customers, the benefits delivered by StorageX are truly incredible:

  • Reduce storage-related operational costs 50%
  • Deploy new storage technology 66% faster
  • Modernize applications for 10X productivity improvement

Data Dynamics is proud to work with all its technology partners who share our passion for object storage.  We are witnessing a MAJOR shift in the storage industry and we are excited about the future potential of object storage.


How DevOps Adoption is Changing

To gather insights on the state of DevOps, we spoke with 22 executives at 19 companies implementing DevOps for themselves and helping clients to implement a DevOps methodology. We asked, “How has DevOps changed since you began using the methodology?” Here’s what they told us:

Adoption

  • As I talk to customers and prospects there’s greater awareness of DevOps and what it can do. It’s being taken more seriously. There is sufficient proof of organizations doing well with DevOps. This is a business process change.
  • A survey we just conducted reflected the insurgency within companies against the way things are done. One year ago, it was, “What is DevOps?” Today there’s a common understanding, along with a desire to know how to scale.
  • CD has become mainstream. Microservices are more commonplace and are a good way to be successful with DevOps. Being in the cloud gives you more flexibility. Containers are becoming more mainstream. Function as a service is a helpful way to solve scaling issues.
  • People are beginning to understand the benefits of DevOps. Best practices have been solidified. DevOps allows you to get code from developers to customers in a fast and secure way.

Read the Full Article

StorageX is fast (baseline copy)

In a previous article, I stated that StorageX is multi-threaded. I also spent quite a bit of time discussing why I consider this fact to be (mostly) irrelevant to the administrator who is using StorageX to perform his file system migrations. What the user of StorageX really wants is for StorageX to do its job as fast as possible: when he is doing a baseline copy, he wants StorageX to fill his network pipe and move the data as quickly as possible, and when he is cutting over to his shiny new NAS hardware, he wants StorageX to do the final incremental copy within his allotted cutover window.

As I mentioned in my previous article, the techniques StorageX uses to fill the network pipe during a baseline copy are very different from those used to find changed files as quickly as possible during an incremental copy. In this article, I will focus on baseline copies.
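
To make the baseline-copy idea concrete, here is a minimal sketch of a multi-threaded copy, assuming a simple source and target directory pair. It illustrates the general technique of running many file copies in parallel to keep the network pipe full; it is not StorageX’s actual engine.

    import shutil
    from concurrent.futures import ThreadPoolExecutor
    from pathlib import Path

    # Hypothetical source and target -- e.g. two mounted NAS shares.
    SOURCE = Path("/mnt/old_nas/projects")
    TARGET = Path("/mnt/new_nas/projects")

    def copy_one(src: Path) -> Path:
        """Copy a single file, recreating its relative path on the target."""
        dst = TARGET / src.relative_to(SOURCE)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)   # copies file data plus timestamps
        return dst

    def baseline_copy(workers: int = 16) -> None:
        """Run many copies in parallel so the network link stays busy."""
        files = [p for p in SOURCE.rglob("*") if p.is_file()]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for dst in pool.map(copy_one, files):
                print("copied", dst)

    if __name__ == "__main__":
        baseline_copy()

The reason parallelism matters for a baseline is that a single copy stream rarely saturates the link; many concurrent streams, especially across lots of small files, come much closer to filling the pipe.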

Continue reading StorageX is fast (baseline copy)