IBM And NVIDIA Power New Scale-Out Gear For AI

Accelerating deep learning (DL) training – on GPUs, TPUs, FPGAs or other accelerators – is in the early days of scale-out architecture, much as the server market was in the mid-2000s. DL training enables the advanced pattern recognition behind modern artificial intelligence (AI) based services. NVIDIA GPUs have been a major driver for DL development and commercialization, but IBM just made an important contribution to scale-out DL acceleration. Understanding what IBM did and how that work advances AI deployments takes some explanation.

Scale Matters

Key Definitions (graphic: TIRIAS Research)

Inference scales out. Trained DL models can be simplified for faster processing while retaining good enough pattern recognition to create profitable services. Inference can scale out as small individual tasks running on many inexpensive servers. There is a lot of industry investment aimed at lowering the cost of delivering inference; we’ll discuss that in the future.

The immediate challenge for creating deployable inference models is that, today, training scales up. Training requires large data sets and high numeric precision; aggressive system designs are needed to meet real-world training times and accuracies. But cloud economics are driven by scale-out.

The challenge for cloud companies deploying DL-based AI services, such as Microsoft’s Cortana, Amazon’s Alexa and Google Home, is that DL training has not scaled well. Poor off-the-shelf scaling is mostly due to the immature state of DL acceleration, forcing service providers to invest (in aggregate) hundreds of millions of dollars in research and development (R&D), engineering and deployment of proprietary scale-out systems.

NVLink Scales Up in Increments of Eight GPUs

GPU evolution has been a key part of DL success over recent years. General purpose processors were, and still are, too slow at processing DL math with large training data sets. NVIDIA invested early in leveraging GPUs for DL acceleration, both in new GPU architectures to further accelerate DL and in DL software development tools that make GPU acceleration easy to access.

An important part of NVIDIA’s GPU acceleration strategy is NVLink, a high-speed, scale-up interconnect architecture that directly connects two to eight GPU sockets. NVLink enables GPUs to train together with minimal processor intervention. Prior to NVLink, GPUs did not have the low-latency interconnect, data flow control sophistication, or unified memory space needed to scale up by themselves. NVIDIA implements NVLink using its SXM2 socket instead of PCIe.
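NVLink is largely invisible to DL developers; frameworks reach it through NVIDIA’s NCCL collective library. As a minimal, hypothetical sketch (assuming PyTorch with the NCCL backend, which is not the software stack described in this article), a single-node, multi-GPU data-parallel training loop looks roughly like this:

```python
# Minimal, hypothetical sketch: single-node, multi-GPU data-parallel training.
# The NCCL backend moves gradients directly between GPUs, over NVLink when the
# GPUs are connected that way; otherwise it falls back to PCIe.
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def train(rank, world_size):
    dist.init_process_group("nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)
    model = DDP(nn.Linear(1024, 1000).to(f"cuda:{rank}"), device_ids=[rank])
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(10):
        x = torch.randn(256, 1024, device=f"cuda:{rank}")  # stand-in for a real minibatch
        loss = model(x).sum()
        opt.zero_grad()
        loss.backward()   # gradients are all-reduced GPU-to-GPU here
        opt.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    n_gpus = torch.cuda.device_count()
    torch.multiprocessing.spawn(train, args=(n_gpus,), nprocs=n_gpus)
```

Each GPU runs the same model on a different slice of data, and the backward pass exchanges gradients directly between GPUs rather than staging them through the host processors.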

NVIDIA’s DGX-1, Microsoft’s Open Compute Project (OCP) Project Olympus HGX-1 GPU chassis and Facebook’s “Big Basin” server contribution to OCP are very similar designs that each house eight NVIDIA Tesla SXM2 GPUs. The DGX-1 design includes a dual-processor x86 server node in the chassis, while the HGX-1 and Big Basin designs must be paired with separate server chassis.

Microsoft’s HGX-1 can bridge four GPU chassis by using its PCIe switch chips to connect the four NVLink domains to one to four server nodes. While all three designs are significant feats of server architecture, the HGX-1’s 32-GPU design limit presents a practical upper limit for directly connected scale-up GPU systems.

Microsoft HGX-1 motherboard with eight SXM2 sockets (four populated). (Image: TIRIAS Research)

The list price for each DGX-1 is $129,000 using NVIDIA’s P100 SXM2 GPU and $149,000 using its V100 SXM2 GPU (including the built-in dual-processor x86 server node). While this price range is within reach of some high-performance computing (HPC) cluster bids, it is not a typical cloud or academic purchase.

Original Design Manufacturers (ODMs) like Quanta Cloud Technology (QCT) manufacture variants of OCP’s HGX-1 and Big Basin chassis, but do not publish pricing. NVIDIA P100 modules are priced from about $5,400 to $9,400 each. Because NVIDIA’s SXM2 GPUs account for most of the cost of both Big Basin and HGX-1, we believe that system pricing for both is in the range of $50,000 to $70,000 per chassis unit (not including matching x86 servers), in cloud-sized purchase quantities.

Facebook’s Big Basin Performance Claims

Facebook published a paper in June describing how it connected 32 Big Basin systems over its internal network to aggregate 256 GPUs and train a ResNet-50 image recognition model in under an hour with about 90% scaling efficiency and 72% accuracy.

While 90% scaling efficiency is an impressive achievement for state-of-the-art, there are several challenges with Facebook’s paper.

The eight-GPU Big Basin chassis is the largest possible scale-up NVIDIA NVLink instance. It is expensive, even if you could buy OCP gear as an enterprise buyer. Plus, Facebook’s paper does not mention which OCP server chassis design and processor model it used for the benchmarks. Which processor it used may be a moot point, because if you are not a cloud giant, it is very difficult to buy a Big Basin chassis or any of the other OCP servers that Facebook uses internally. Using different hardware, your mileage is guaranteed to vary.

Facebook also does not divulge the operating system or development tools used in the paper, because Facebook has its own internal cloud instances and development environments. No one else has access to them.

The net effect is that it is nearly impossible to replicate Facebook’s achievement if you are not Facebook.

Facebook Big Basin Server (Image: TIRIAS Research)

IBM Scales Out with Four GPUs in a System

IBM recently published a paper as a follow-up to the Facebook paper. IBM’s paper describes how to train a ResNet-50 model in under an hour at 95% scaling efficiency and 75% accuracy, using the same data sets that Facebook used for training. IBM’s paper is notable in several ways:

  1. Not only did IBM beat Facebook on all the metrics, but 95% efficiency is close to linear scaling.
  2. Anyone can buy the equipment and software to replicate IBM’s work. Equipment, operating systems and development environments are called out in the paper.
  3. IBM used smaller scale-out units than Facebook. Assuming Facebook used its standard dual-socket compute chassis, IBM has half the ratio of GPUs to CPUs – Facebook uses a 4:1 ratio and IBM uses a 2:1 ratio.

IBM sells its OpenPOWER “Minsky” deep learning reference design as the Power Systems S822LC for HPC. IBM’s PowerAI software platform with Distributed Deep Learning (DDL) libraries includes IBM-Caffe and “topology aware communication” libraries. PowerAI DDL is specific to OpenPOWER-based systems, so it will run on similar POWER8 Minsky-based designs and upcoming POWER9 “Zaius”-based systems (Zaius was designed by Google and Rackspace), such as those shown at various events by Wistron, E4, Inventec and Zoom.

PowerAI DDL enables building large scale-out systems out of smaller, more affordable, GPU-based scale-up servers. It optimizes communication between GPU servers according to the network topology, the capabilities of each network link, and the latencies of each phase of a DL model.
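IBM has not published DDL’s internals, so the following is only a rough illustration of the kind of collective operation such libraries optimize, not IBM’s algorithm: a ring all-reduce, simulated here with NumPy arrays standing in for per-server gradients. A topology-aware library decides how to map rings like this onto the actual NVLink, PCIe and network hops.

```python
# Rough illustration only: a simulated ring all-reduce, the collective pattern
# scale-out DL systems commonly use to sum gradients across servers.
import numpy as np

def ring_allreduce(vectors):
    """Simulate a ring all-reduce: every worker ends with the element-wise sum."""
    n = len(vectors)
    # Each worker holds its own gradient vector, split into n chunks.
    chunks = [list(np.array_split(v.astype(float), n)) for v in vectors]

    # Phase 1: reduce-scatter. After n-1 steps, worker i holds the full sum
    # of chunk (i + 1) % n.
    for step in range(n - 1):
        sends = [(i, (i - step) % n, chunks[i][(i - step) % n].copy())
                 for i in range(n)]
        for src, idx, data in sends:
            chunks[(src + 1) % n][idx] += data

    # Phase 2: all-gather. Each fully reduced chunk circulates around the ring.
    for step in range(n - 1):
        sends = [(i, (i + 1 - step) % n, chunks[i][(i + 1 - step) % n].copy())
                 for i in range(n)]
        for src, idx, data in sends:
            chunks[(src + 1) % n][idx] = data

    return [np.concatenate(c) for c in chunks]

# Sanity check with 4 simulated workers.
grads = [np.arange(8) * (w + 1) for w in range(4)]
for result in ring_allreduce(grads):
    assert np.allclose(result, sum(grads))
```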

IBM used 64 Power System S822LC systems, each with four NVIDIA Tesla P100 SXM2-connected GPUs and two POWER8 processors, for a total of 256 GPUs – matching Facebook’s paper. Even with twice as many IBM GPU-accelerated chassis required to host the same number of GPUs as in Facebook’s system, IBM achieved a higher scaling efficiency than Facebook. That is no small feat.
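For context on those efficiency figures: scaling efficiency is commonly defined as the measured speedup divided by the GPU count (the two papers may measure it somewhat differently, so treat that definition as an assumption). Applying it to the published numbers:

```python
# Effective speedup implied by the published efficiency figures, assuming the
# common definition: efficiency = measured speedup / number of GPUs.
def effective_speedup(n_gpus, efficiency):
    return n_gpus * efficiency

print(effective_speedup(256, 0.90))  # Facebook: ~230x a single GPU
print(effective_speedup(256, 0.95))  # IBM: ~243x a single GPU
```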

IBM Power System S822LC with two POWER8 processors (silver heat sinks) and four NVIDIA Tesla P100 SXM2 modules (Image: TIRIAS Research)

Commercial availability of IBM’s S822LC for low-volume buyers will be a key element enabling academic and enterprise researchers to buy a few systems and test IBM’s hardware and software scaling efficiencies. The base price for an IBM S822LC for Big Data (without GPUs) is $6,400, so the total price of an S822LC for High Performance Computing should be in the $30,000 to $50,000 ballpark (including the dual-processor POWER8 server node), depending on which P100 model is installed and other options.

Half the battle is knowing that something can be done. We believe IBM’s paper and product availability will spur a lot of DL development work by other hardware and software vendors.

— The author and members of the TIRIAS Research staff do not hold equity positions in any of the companies mentioned. TIRIAS Research tracks and consults for companies throughout the electronics ecosystem from semiconductors to systems and sensors to the cloud.

[Source: Forbes]

Smart Master Data Management Will Power Your Customer Analytics And Insights


It’s supposed to be a post-omni-channel world, a time when personalization is the norm. But as customers, we know this is not the case. When customers contact companies, the companies often don’t recognize who the customer is, and they certainly don’t know what the customer’s preferences are. The truth is customers rarely get the personalized, tailored experiences they desire.

It’s not because of a shortage of data about those customers.

So why is it that when customers contact companies, most companies seem completely unaware of anything specific about that customer, let alone their tastes and preferences? The reason is poor aggregation and management of the customer’s data inside the company.

Companies have many challenges when it comes to using customer data, particularly data that is supposed to help the company make better decisions in real time.

David Rowley, CTO of IAC Publishing Labs, and former executive at customer experience software vendor Sprinklr, said, “the notion of a centralized repository for key business data is an important aspect of providing a more comprehensive customer experience.”

But many companies don’t have this today.

This information is called master data. “Master data” can be about products, employees, materials or suppliers, but it may also include documents and sales. Rowley said, “A centralized master data store (the repository for the master data) can improve real time decision making and analytics, if information about the entities (e.g. customers) is stored centrally. This provides you with a single trustworthy source of truth about the customer.” Rowley added that you can report directly on that data, rather than having to aggregate customer data from separate systems. He said, “A central view of the customer, for example, keeps various systems in sync with what the customer has done.”
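As a purely hypothetical sketch of that single-source-of-truth idea (the systems, field names and last-writer-wins rule below are invented for illustration and are not any vendor’s product), consolidating per-system customer records into one master record might look like this:

```python
# Hypothetical sketch: merge per-system customer records into one master
# ("golden") record keyed by a shared customer ID. A real MDM tool would apply
# configurable survivorship rules; here, later sources simply win on conflicts.
crm = {"cust-42": {"name": "A. Smith", "email": "asmith@example.com"}}
support = {"cust-42": {"email": "a.smith@example.com", "phone": "555-0100"}}
billing = {"cust-42": {"name": "Alice Smith", "plan": "premium"}}

def build_master(sources):
    master = {}
    for system in sources:
        for cust_id, record in system.items():
            master.setdefault(cust_id, {}).update(record)
    return master

golden = build_master([crm, support, billing])
print(golden["cust-42"])
# {'name': 'Alice Smith', 'email': 'a.smith@example.com', 'phone': '555-0100', 'plan': 'premium'}
```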

Vic Bhagat, CIO of Verizon Business Services, said, “master data is more important now than ever – the amount of data being generated today will pale in comparison to what will be generated in just a few more years.” He added, “Having a master data management strategy clearly defined — early on — enables a faster approach to analytics to deliver a more proactive, predictive, prescriptive outcome that customers expect today.”

As we established, companies struggle with the current state of their customer data management. But what happens when the business becomes a little more complicated, such as with a merger or acquisition? How does the company merge all the new data with all the historical data? Master data management can also help with this.

We addressed the fact that many data systems aren’t integrated. Sometimes the data problems include the data sources themselves, duplicate records, no data governance and no standards. Data is not always shared efficiently within the organization. Many of the world’s biggest companies operate like separate islands. Customer service and sales do not share a CRM. The company does not collaborate around the customer to ensure a powerful customer experience. The company’s various departments operate in silos. More often than not, people in different departments do not even know one another, let alone use data that spans the organization.

This is a challenge for organizations that need to create easy and elegant experiences for customers. Companies need to leverage data in real time to make better decisions. Customer analytics are not helpful when one can’t trust the accuracy of the data. And at many companies today, data is often too little, too late.

“Leveraging master data enables enterprises to clean, integrate, and supplement their data to ensure a complete 360-degree view of their customer. With this view, companies can make informed decisions, align different departments, and create world-class online and offline customer experiences leading to sales growth,” says Rishi Dave, chief marketing officer at Dun & Bradstreet, a data and analytics company that provides master data.

End-to-end master data management helps clients make marketing campaigns 30% more efficient, improve upsell and cross-sell rates by 60% and increase loyalty members’ spending by 20%, according to Informatica.

Neopost Uses Master Data For A Transformation Initiative

Neopost, a market-leading global provider of mailing solutions, digital communications and shipping services, found that master data was critical to a transformation initiative. The company wanted a way to better know and serve its customers. As Neopost focused on modernizing its offerings and delivery for the digital age, strong data management has played a key role.

“Besides developing the actual software we sell our customers, we’ve got to build an infrastructure where data quality is paramount,” says Steve Rakoczy, Neopost’s North America CIO. “You can make a mess really quickly in the electronic age if you don’t have your data right.” After using master data, Rakoczy said, “We can successfully manage our Salesforce environment for the first time in over a decade.”

Master data management can help your company create more compelling customer experiences, but first the company must decide on a strong data approach.

For more from author Blake Morgan sign up for her weekly customer experience newsletter here.

[Source: NDTV]

EcoFlow River is a mobile power plant for all your gear

Currently available on Indiegogo for $499 (roughly £390 or AU$675), the EcoFlow River mobile power station claims a huge capacity of 412 Wh, or 116,000 mAh, with a total output of 500 watts. Up front you’ll find spots for all your mobile charging needs: four USB ports (two with Quick Charge 2.0), two USB Type-C with Quick Charge 3.0 and two DC outlets. Around back you’ll find two AC outlets and a 12-volt car port.
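A quick check of that capacity math, assuming the milliamp-hour figure is quoted at the ~3.55 V nominal cell voltage battery makers typically use (EcoFlow does not state the reference voltage):

```python
# Back-of-the-envelope Wh-to-mAh conversion; the 3.55 V nominal cell voltage
# is an assumption, not an EcoFlow specification.
watt_hours = 412
nominal_cell_volts = 3.55
milliamp_hours = watt_hours / nominal_cell_volts * 1000
print(round(milliamp_hours))  # ~116,056 mAh, in line with the 116,000 mAh claim
```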

I had a chance to test an early sample and, while I did no formal battery testing, it seems to perform as promised, powering and charging up to 11 devices simultaneously for hours. It was a perfect companion for everything from car camping trips to home improvement work to keeping my drone and GoPro batteries charged up while I was out flying for the day.

That amount of power also meant I could go work on my laptop out in the park for hours with my screen on full brightness, without worrying about finding an outlet or my phone battery dying while I used it as a mobile hotspot. And while using it outdoors or on the go is a given, I also found myself using it around the house in lieu of an extension cord.

Also, because the thing weighs around 11 pounds (5 kg), it wasn’t too much of a hassle to carry around, and the display on the front lets you know just how much power you’re using and the time remaining. What really sells the River, though, is that it makes a great emergency power source since it doesn’t trickle-discharge as quickly as competing stations. EcoFlow claims it will keep a full charge for one year, so it’ll be ready whenever you need it.

When its batteries are drained, a full charge from a wall outlet takes only 6 hours. If you’re traveling, a car lighter socket will get it fully charged in 9 hours. EcoFlow also offers an optional solar panel that will take the River from empty to full in 10 to 15 hours.
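Those charge times imply roughly the following average input power, ignoring charging losses (a back-of-the-envelope estimate, not an EcoFlow specification):

```python
# Rough average input power implied by the stated charge times.
capacity_wh = 412
for source, hours in {"wall outlet": 6, "car socket": 9, "solar panel": 12.5}.items():
    print(f"{source}: ~{capacity_wh / hours:.0f} W average input")
# wall outlet: ~69 W, car socket: ~46 W, solar panel (10-15 h midpoint): ~33 W
```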

[Source: CNBC]

Trump to review power to establish federal lands

Washington (CNN) President Donald Trump will order a review of the 1906 law that gives the president of the United States power to set aside lands for federal protection, administration officials tell CNN, setting into motion a process that could see the Trump administration rescind the protection of lands designated by former President Barack Obama.

Trump will sign the executive order Wednesday at the Interior Department, Secretary Ryan Zinke told reporters. The order could lead to the reshaping of roughly 30 national monuments that were designated by Presidents Bill Clinton, George W. Bush and Barack Obama after 1996.
At the heart of this proposal is Bears Ears National Monument, a 1.3-million-acre parcel of lands that includes world-class rock climbing, age-old cliff dwellings and land sacred to Pueblo Indians that Obama designated a monument in 2016.
Zinke said Tuesday in a briefing with reporters that he will make a recommendation on the contested parcel of land in 45 days and will later provide Trump with a fuller report.
“We feel that the public, the people that monuments affect, should be considered and that is why the President is asking for a review of the monuments designated in the last 20 years,” Zinke said, adding that he believes the review is “long overdue.”
Utah Republicans, angry that Obama designated the land for federal protection, have called on the Trump administration to remove the protection and give the parcel back to the deep red state — possibly to authorize drilling. But that action has been met with vocal opposition from environmental groups, outdoor outfitters and Native American tribes, who argue federal protection is not only better for the environment, but better for the economy in a rural, economically depressed area of the Beehive State.
“The policy is consistent with President Trump’s promise to give Americans a voice and make sure their voices are heard,” the interior secretary added, arguing that the order “restores the trust between local communities and Washington” and lets rural America know “states will have a voice” in land designation.
That argument is largely dismissed by the White House.
“Past administrations have overused this power and designated large swaths of land well beyond the areas in need of protection,” a White House official said Tuesday. “The Antiquities Act Executive Order directs the Department of the Interior to review prior monument designations and suggest legislative changes or modifications to the monument proclamations.”
The move by Trump will not resolve the Bears Ears issue. Instead, it will set up a process to review the designation and make a decision at a later date. But groups that support keeping Bears Ears in federal control believe the Trump administration’s decision, led by Zinke, is the first step in the process to give the land back to Utah.
Rose Marcario, president and CEO of the outdoor outfitter Patagonia, said the review “is an assault on America’s most treasured lands and oceans.”
“Bears Ears and other national monuments were designated after significant community input because they are a critical part of our national heritage and have exceptional ecological characteristics worth protecting for future generations,” Marcario said. “It’s extremely disturbing to see the Trump administration apparently laying the groundwork to remove protections on our public lands.”
Zinke said he is prepared for legal challenges from environmental groups — “I am not in fear of getting sued, I get sued all the time,” he said — but acknowledged that it is “untested” whether the President has the power to shrink public lands by using the Antiquities Act.
And there are likely to be legal challenges. Marcario told CNN Patagonia was “preparing to take every step necessary, including possible legal action” in order to protect Bears Ears and other national monuments.
Republicans in Utah, including Gov. Gary Herbert, have asked the Trump administration to rescind the national monument status for Bears Ears, arguing the designation infringes on their state’s rights. Herbert signed a resolution in February that urged Trump to rescind Bears Ears’ status.
Led by Utah Reps. Jason Chaffetz and Chris Stewart in Washington, along with Sen. Orrin Hatch, Republicans are urging Congress to withhold money for the national monument in response to the designation.
Mining companies have also been eager for a decision. EOG Resources, a Texas-based company, was recently approved to drill near Bears Ears. And activists are worried that the area, which is rich in natural resources, could be offered up to oil companies if it is de-listed.
Hatch said in response to Trump’s forthcoming order that he is “committed to rolling back the egregious abuse of the Antiquities Act to serve far-left special interests.”
Hatch’s opponents argue that withholding funds or rescinding the Antiquities Act order would impact San Juan County, Utah, where more than 28% of the population lives below the poverty level.
Public Land Solutions, a pro-federal-designation group, said in a recent report that the economic benefits of Bears Ears to the area should outweigh any benefits from mineral or oil extraction.
“Show me the money,” said Ashley Korenblat from Public Land Solutions. “We are confident that a fact-based review of the national parks and public lands protected as monuments by the Antiquities Act will show year-over-year economic growth.”
Bears Ears is not the only site that has experienced a push to give up federal protection.
Republicans in Maine, including Gov. Paul LePage, have asked Trump to stop national monument status for Katahdin Woods and Waters National Monument, an expansive piece of land that includes much of the Penobscot River watershed. Like Bears Ears, that parcel was designated a national monument by Obama in 2016.
The order will review any monument created from Grand Staircase-Escalante in September 1996 to Bears Ears in 2016 that covers 100,000 acres or more. Under those criteria, Katahdin Woods and Waters would not be reviewed, despite the calls.
The monuments the Trump administration will review include Grand Canyon-Parashant National Monument, Grand Staircase-Escalante National Monument and Basin and Range National Monument, as well as a host of Pacific Ocean monuments, including the World War II Valor in the Pacific National Monument. All but one of the monuments set to be reviewed is west of the Mississippi.
Should Trump and his administration opt to de-list these sites, they would be going back on some of their promises to both voters and the members of Congress who oversaw Zinke’s confirmation process.
“I don’t like the idea because I want to keep the lands great,” Trump said in January 2015 during an interview with Field & Stream when asked about transferring public lands to state control. Donald Trump, Jr. has also said he is in favor of “refunding” federal lands to keep them out of private control.
Those views aren’t in line with much of Republican orthodoxy, which has long held that the federal government should control less land, not more.
But Trump isn’t the only Republican who expressed this view: Zinke also told senators during his confirmation process that he was against giving public lands back to the states.
“I am absolutely against transfer and sale of public lands. I can’t be more clear,” he said when Sen. Maria Cantwell, a Washington Democrat, asked whether, under Trump, federal land would come under “unbelievable attack by those who would like to take these public lands away from us and turn them over back to states.”
Zinke stood by that statement Tuesday, arguing that it is wrong to suggest the review will lead to the transfer of public lands.
“I think that argument is false,” he said, blaming “modern media” for the polarized views on Bears Ears.
Cantwell said Tuesday that Trump’s decision to de-list would be “illegal” and faulted Zinke and others for doing the “bidding” for coal and natural resource companies.
[Source: CNN]