Thursday, January 31, 2019

How Casinos Are Gambling With Their Video Surveillance Storage - NetApp Certifications


Here’s some free advice: Don’t try to rob a casino.

It’s no secret that casinos are some of the most watched places on the planet. Hundreds of cameras cover every square inch of the building, giving security personnel high-definition, 24/7 visibility into what every patron and every employee is doing at any given time. They know which games you’re playing and for how long. They can track how much money you’re winning (or, more likely, losing), and what you do after you leave the table. They can even use advanced AI and analytics to predict your next move so that they can accurately staff bars and tables based on real-time activity.

They’re looking for card counters, whose disguises (glasses, mustaches, wigs, and hats) are no match for their facial recognition technology. (Hot tip: If you’re counting cards and the casino hasn’t kicked you out yet, it’s not because you’re really good. It’s because you’re really bad and they’ve decided to let you stay to lose more money.)

They’re looking for people with gambling addictions who have placed themselves on exclusion lists and can actually sue the casino if they’re allowed to play. And they’re looking for criminals who aren’t allowed in casinos by law.

Admittedly, much of the video captured by casinos is used after the fact for evidence. So even if you do succeed in pulling off an Ocean’s 11 heist, it probably won’t be long until the video is used to track you down.

Cameras are essential to the 24/7 operation of a casino. If even one camera goes down over a game table, the casino must shut down that table. If several cameras go down, the casino must close the entire floor, potentially losing thousands of dollars in revenue, to say nothing of the damage to its reputation. In the event of a shutdown, a casino can even be penalized by regulatory agencies, with fines reaching into the millions.

The Casino Storage Paradox


Yet, even with millions of dollars at stake, many casinos are still running outdated and unreliable video surveillance storage. They pump money into cameras and analytics software but prop them up with cheap, commodity storage. Traditional video deployments with low-cost, white-box digital video recorders (DVRs) are not only prone to failure, they’re also expensive to manage and extremely difficult to scale.

If you put cheap tires on a high-performance race car, you’re going to have a bad time. 

Casinos that aren’t thinking about the storage that their video surveillance infrastructure is running on are putting their reputations and their businesses at risk. With NetApp® E-Series systems, casinos don’t have to gamble with their video surveillance infrastructure. The NetApp E-Series video surveillance storage solution is designed for the highest levels of reliability, speed, and scalability. Easy manageability and low total cost of ownership make it a perfect choice for cost-conscious casinos.




Sunday, January 20, 2019

NetApp CSO 2019 Perspectives - NetApp Certifications


As we enter 2019, what stands out is how trends in business and technology are connected by common themes. For example, AI is at the heart of trends in development, data management, and delivery of applications and services at the edge, core, and cloud. Also essential are containerization as a critical enabling technology and the increasing intelligence of IoT devices at the edge. Navigating the tempests of transformation are developers, whose requirements are driving the rapid creation of new paradigms and technologies that they must then master in pursuit of long-term competitive advantage.

1) AI projects must prove themselves first in the clouds

Still at an early stage of development, AI technologies will see action in an explosion of new projects, the majority of which will begin in public clouds.

A rapidly growing body of AI software and service tools – mostly in the cloud – will make early AI development, experimentation, and testing progressively easier. This will enable AI applications to deliver high performance and scalability, both on and off premises, and to support multiple data access protocols and varied new data formats. Accordingly, the infrastructure supporting AI workloads will also have to be fast, resilient, and automated, and it must support the movement of workloads within and among multiple clouds as well as on and off premises. As AI becomes the next battleground for infrastructure vendors, most new development will use the cloud as a proving ground.

2) IoT: Don’t phone home. Figure it out.

Edge devices will get smarter and more capable of making processing and application decisions in real time.

Traditional Internet of Things (IoT) devices have been built around an inherent “phone home” paradigm: collect data, send it for processing, wait for instructions. But even with the advent of 5G networks, real-time decisions can’t wait for data to make the round trip to a cloud or data center and back, and the rate of data growth is only increasing. As a result, data processing will have to happen close to the consumer, which will intensify the demand for more data processing capability at the edge. IoT devices and applications – with built-in services such as data analysis and data reduction – will get better, faster, and smarter about deciding what data requires immediate action, what data gets sent home to the core or to the cloud, and even what data can be discarded.
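
To make that triage concrete, here is a minimal Python sketch of the kind of decision logic an edge device might run; the thresholds, batch size, and function names are illustrative assumptions, not part of any particular IoT platform or NetApp product.

from statistics import mean

URGENT = 90.0       # assumed threshold: act immediately at the edge
INTERESTING = 70.0  # assumed threshold: worth summarizing for the core or cloud

def trigger_local_action(value: float) -> None:
    # Stand-in for a real actuator or alerting call; no round trip required.
    print(f"edge actuator fired at {value:.1f}")

def send_to_core(summary: float, samples: int) -> None:
    # Stand-in for an uplink to the core data center or cloud.
    print(f"uplink: mean={summary:.1f} over {samples} samples")

def handle_reading(value: float, buffer: list, batch_size: int = 5) -> str:
    if value >= URGENT:
        trigger_local_action(value)      # real-time decision made at the edge
        return "acted locally"
    if value >= INTERESTING:
        buffer.append(value)             # only a reduced summary ever leaves the device
        if len(buffer) >= batch_size:
            send_to_core(mean(buffer), len(buffer))
            buffer.clear()
        return "queued for core"
    return "discarded"                   # routine data never phones home

if __name__ == "__main__":
    buf = []
    for reading in [55.2, 71.0, 95.3, 72.4, 68.9, 74.1, 73.3, 75.0]:
        print(reading, "->", handle_reading(reading, buf))

The point of the sketch is simply the triage: act, summarize, or discard at the edge, rather than shipping every reading home and waiting for instructions.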

3) Automagically, please

The demand for highly simplified IT services will drive continued abstraction of IT resources and the commoditization of data services.

Remember when car ads began boasting that your first tune-up would be at 100,000 miles? (Well, it eventually became sort of true.) Point is, hardly anyone’s spending weekends changing their own oil or spark plugs or adjusting timing belts anymore. You turn on the car, it runs. You don’t have to think about it until you get a message saying something needs attention. Pretty simple. The same expectations are developing for IT infrastructure, starting with storage and data management: developers and practitioners don’t want to think about it; they just want it to work. “Automagically,” please. Especially with containerization and “serverless” technologies, the trend toward abstraction of individual systems and services will drive IT architects to design for data and data processing and to build hybrid, multi-cloud data fabrics rather than just data centers. With the application of predictive technologies and diagnostics, decision makers will rely more and more on extremely robust yet “invisible” data services that deliver data when and where it’s needed, wherever it lives. These new capabilities will also automate the brokerage of infrastructure services as dynamic commodities and the shuttling of containers and workloads to and from the most efficient service provider solutions for the job.

4) Building for multi-cloud will be a choice

Hybrid, multi-cloud will be the default IT architecture for most larger organizations while others will choose the simplicity and consistency of a single cloud provider.

Containers will make workloads extremely portable. But data itself can be far less portable than compute and application resources, and that affects the portability of runtime environments. Even if you solve for data gravity, data consistency, data protection, data security, and all that, you can still face the problem of platform lock-in: the cloud provider-specific services you’re writing against are not portable across clouds at all. As a result, smaller organizations will either develop in-house capabilities as an alternative to cloud service providers, or they’ll choose the simplicity, optimization, and hands-off management that come from buying into a single cloud provider. And you can count on service providers to develop new differentiators to reward those who choose lock-in. On the other hand, larger organizations will demand the flexibility, neutrality, and cost-effectiveness of being able to move applications between clouds. They’ll leverage containers and data fabrics to break lock-in, to ensure total portability, and to control their own destiny. Whatever path they choose, organizations of all sizes will need to develop policies and practices to get the most out of their choice.

5) The container promise: really cool new stuff

Container-based cloud orchestration will enable true hybrid cloud application development.

Containers promise, among other things, freedom from vendor lock-in. While containerization technologies like Docker will continue to have relevance, the de facto standard for multi-cloud application development (at the risk of stating the obvious) will be Kubernetes. But here’s the cool stuff… New container-based cloud orchestration technologies will enable true hybrid cloud application development, which means new development will produce applications for both public cloud and on-premises use cases: no more porting applications back and forth. This will make it increasingly easy to move workloads to where data is being generated, rather than moving the data to the workloads, as has traditionally been done.
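
As a rough illustration of that portability, the sketch below uses the official Kubernetes Python client to apply one and the same Deployment to an on-premises cluster and a public cloud cluster, changing nothing but the kubeconfig context; the context names, image, and labels are hypothetical, not tied to any specific environment.

from kubernetes import client, config

def demo_deployment() -> client.V1Deployment:
    # One container spec, defined once, reused unchanged on every cluster.
    container = client.V1Container(
        name="demo-app",
        image="nginx:1.25",  # placeholder image
        ports=[client.V1ContainerPort(container_port=80)],
    )
    return client.V1Deployment(
        metadata=client.V1ObjectMeta(name="demo-app"),
        spec=client.V1DeploymentSpec(
            replicas=2,
            selector=client.V1LabelSelector(match_labels={"app": "demo"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "demo"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

# Hypothetical kubeconfig contexts: one on-premises cluster, one public cloud cluster.
for context in ("onprem-cluster", "public-cloud-cluster"):
    api_client = config.new_client_from_config(context=context)
    apps = client.AppsV1Api(api_client=api_client)
    apps.create_namespaced_deployment(namespace="default", body=demo_deployment())
    print(f"deployed demo-app to {context}")

The workload definition lands on both clusters untouched; what still differs in practice is where the data lives, which is exactly the gap the data fabric idea aims to close.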




Tuesday, January 8, 2019

Exploring the Infinite Possibilities of AI with NetApp Certifications


“I think we are entering the golden decades of artificial intelligence,” Dave Hitz, NetApp’s co-founder and EVP, shared during an interview at NetApp Insight 2018 in Las Vegas. Futurist Gerd Leonhard agreed during his keynote: “Humanity will change more in the next 20 years than in the previous 300 years.”

How do companies make the most of their data in this brave new world of AI? The right partners hold the key. During Insight 2018, we explored the life-changing power of AI with leaders from NVIDIA and WuXiNextCODE.

The Big Data AI Train has Left the Station


NetApp’s own business model has shifted from storage to data and the cloud, and AI will be a driving force in our continued evolution. Bharat Badrinath, NetApp’s VP of Product and Solutions Marketing, shared, “AI has a profound benefit of changing how our customers operate; their entire operations have been transformed dramatically overnight.”

As Renee Yao, NVIDIA’s Senior Product Marketing Manager of Deep Learning & AI Systems, noted at Insight 2018, “We need to learn as fast and as much as we can. We can’t let the competition determine where our limit is; instead [we should only be limited by] what is possible—that is a fundamental mindset change in this AI revolution.”

Yao explained that the era of collecting and storing big data laid the foundation for this moment. Now, with the computational abilities of AI and deep learning, companies can process big data and optimize their systems in ways previously impossible.

She shared how NVIDIA, in partnership with NetApp, helped the Swiss Federal Railroad manage a system that carries more than a million passengers over 3,232 kilometers of track. The railway routes more than 10,000 trains a day, often traveling at up to 160 kilometers an hour.

Yao noted that, as a single train runs through the system, it passes 11 switches that move trains from one track to another along a route. Those 11 switches provide 30 different possible ways of routing a train. Add a second train to the system and the possible routes multiply to 900 combinations. By the 80th train, the number of possible route combinations is ten to the power of 80. “That’s more route combinations than the number of observed atoms in the universe,” Yao said, putting things into perspective.
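
For a rough sense of how quickly those combinations compound, here is a tiny Python sketch using only the figures quoted above (about 30 ways to route a single train); the set of train counts printed is arbitrary.

ROUTES_PER_TRAIN = 30  # from the example above: 11 switches give ~30 routings for one train

for trains in (1, 2, 3, 5, 10):
    print(f"{trains:2d} train(s): {ROUTES_PER_TRAIN ** trains:,} possible route combinations")
# 1 train -> 30 and 2 trains -> 900, matching the talk; the count grows exponentially from there.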

Now, imagine safely routing the railway’s 10,000+ daily trains. “That’s more possibilities and more data than a human can calculate,” she said. Yet the railroad’s interlocking switch system must ensure that all of those trains reach their destinations without colliding.

With the NVIDIA DGX Station, a purpose-built AI workstation, the Swiss Federal Railroad is now able to simulate an entire day of train routes in just 17 seconds. Through AI, the railroad can simulate and map an unfathomable number of possible routes in less time than it takes to reheat a slice of pizza.

Must Have: Collaboration Between IT and Data Science Teams


This level of interaction between big data and AI requires an even closer, more in-step collaboration between a company’s IT and data science teams. Currently, data scientists often wait around as the IT team architects and tests new infrastructures. Likewise, when IT can’t anticipate the infrastructure needs of the data scientists, innovation can grind to a halt.

“No one can afford to be reactive,” Badrinath says. “Data scientists want to do the activation, but they can’t just go to the infrastructure team and say, ‘Hey! This is my workload—do something about it.'”

To help resolve this stalemate, NetApp and NVIDIA partnered to streamline organizational collaboration by bridging data and the cloud. Over the course of a year, NetApp integrated its systems with NVIDIA’s DGX-1 supercomputer to create a single package for NetApp customers. This makes it easier for companies to deploy AI in a pre-validated system without time-consuming handoffs across silos. With IT and data science teams working together, data and AI can interact seamlessly to fuel business innovation.

Genomics Data + AI = A Life-changing Solution


WuXiNextCODE showcases what this kind of collaboration could mean for our future. The company uses genomic sequence data to improve human health and knew that AI could unearth valuable—even life-saving—insights hidden in that data. They came to NetApp and NVIDIA with 20 years of historical data, totaling nearly 15 petabytes, but only two staff members to manage it all.

Using AI systems developed through the NetApp / NVIDIA partnership, WuXiNextCODE can run huge simulations and rapid-fire queries to measure the impact of genetic mutations.

“In the past, sequencing was very slow,” Dr. Hákon Guðbjartsson, Chief Informatics Officer at WuXiNextCODE, shared during Insight 2018. He explained that a pediatrician would have to guess what gene was involved in a disease. But now, doctors can begin to narrow down possible mutations based on AI-processed sequence data.

There’s still a way to go, however: “Today, this process needs to be much more data-driven,” Dr. Guðbjartsson said. “You have millions of variants in each given individual, so you need more automation.” AI offers the way forward.

With the right partnerships and collaboration between IT and data science, innovators like WuXiNextCODE can harness the power of AI to do in seconds what previously couldn’t be done in a lifetime. As we stand at the brink of this new golden age, we’re more excited than ever about the infinite possibilities of AI.



