'Data Fabric' in the New Era of Cloud Integration

Cloud innovators are enabling companies and federal agencies to fully leverage artificial intelligence and complex analytics in their business lines.

There is an ongoing push for organizations to migrate their data storage to public clouds, and advances in cloud technologies from NetApp and ThunderCat Technology allow both private conglomerates and federal agencies to embrace these digital modernization efforts.

In an interview with GovernmentCIO Media & Research, cloud computing experts from both companies detailed how NetApp's Data Fabric platform supports that transition.

"Traditionally they've kept their data in data centers on premise and stored and managed it very effectively there," said NetApp Deputy CTO Jim Cosby. "But now with the cloud, that's the new way people have looked to manage data for the past five or 10 years."

Still, Cosby acknowledged that for institutions with heavy information management demands, there are "challenges around how you get that data to the cloud and effectively manage it."

This has led organizations to consider an intermediary hybrid cloud solution, a transitional state that allows them to adapt to the cloud gradually while maintaining core functions and a degree of on-site data storage. In meeting this need, NetApp Data Fabric empowers organizations to establish a hybrid cloud storage network that is both comprehensive and flexible.

“Hybrid cloud is very interesting because it gives you a bridge to get between your traditional data center up to the cloud. So as an initial step, people start to take their data and figure, ‘How can I move it near the cloud, but maybe not in the cloud?’” Cosby said.

In discussing the approaches organizations with data-intensive business lines take to adopt cloud technology, ThunderCat Senior Cloud Architect Randy Pierce highlighted the inherent adaptability of hybrid cloud solutions.

“It makes sense for some customers to have the majority of their data on premise, but that flexibility to maybe burst or leverage some of the public cloud is definitely a compelling offer when we start talking to our customers about options,” he said.

While recognizing the considerable effort that goes into the public cloud transition, both Pierce and Cosby emphasized the manifold benefits and long-term payoff of embracing a more sophisticated data storage platform.

Pierce noted the capacity for public cloud storage to enable a caliber of data processing and integration that would have previously been impossible under a more fragmented storage model.

“When we started looking at NetApp Data Fabric, I like to think of it as: it allows us to go from 'data puddles' to 'data lakes.' So now we’re breaking down these data silos and we’re bringing all this data into a central location," he said. "Once we have that data in a central location in that data lake, we start doing analytics that then allow us to have actionable items that benefit the agency or business."

Similarly, Cosby emphasized the sheer processing power enabled by public cloud storage — a benefit that can drive leaps forward in industries including health care, security and defense. It is especially valuable for organizations looking to embrace previously unfeasible artificial intelligence and machine-learning capabilities, use cases to which NetApp’s designers are keenly attentive.

“You could think about health care, which is a very large area with lots of diagnostic processing that happens to determine the right medicines, the right cure for illnesses," Cosby said. "If you think about the military developing autonomous vehicles that can go out to the battlefield and do the job where you used to have people involved in harm’s way, and now you’ve got that ability to protect people by having vehicles do that. So AI leads into that."
