University of Queensland highlights potential of ultrasound to delay the onset of Alzheimer’s disease

The University of Queensland is modeling possible treatments for some of the most debilitating illnesses, such as Alzheimer’s disease. The Queensland Brain Institute, the University’s neuroscience research institute, is using its high performance computing system to model the behavior of ultrasound with an analysis technique called the Finite Element Method (FEM). The modeling calculates what happens to each element of the brain when ultrasound is passed through the skull. It is hoped that ultrasound can be used to temporarily open the blood-brain barrier, allowing direct delivery of therapeutic drugs to the brain, something not currently possible. The team also hopes to activate cells that can digest the plaques that are a hallmark of Alzheimer’s disease. The promising results will now be confirmed in a sheep study, as sheep have skull properties similar to those of humans, and may be instrumental in developing treatments that stop or reverse degeneration, rather than just relieving symptoms.

I am inspired! And it’s Dell Technologies customers like the University of Queensland who give me hope for our future.

The University of Queensland (UQ) is consistently rated one of Australia’s premier research-intensive universities by independent third parties. For example, UQ was awarded five out of five stars for research grants and research intensity by the Good Universities Guide, and the influential Academic Ranking of World Universities places UQ 55th globally. To maintain these world-class rankings, UQ strives to empower its faculty with leading-edge high performance computing (HPC) systems. The latest example is the Wiener system, an HPC cluster designed in cooperation with Dell EMC to accelerate discovery and innovation.

Niche supercomputing for cutting-edge microscopy

The university’s Research Computing Centre (RCC) is tasked with managing HPC resources.
The RCC has developed a strategy for broadening the HPC infrastructure base, built on the theory that it’s more cost-effective to serve different application types with different machines.

Launched in 2018, Wiener is the first dedicated GPU-accelerated supercomputer at an Australian university. It was designed specifically for the imaging-intensive workloads generated by the University’s microscopy facilities, including the world-class Lattice Light Sheet Microscope based at the University’s Institute for Molecular Bioscience. Wiener was developed and paid for with strategic funding from UQ and a consortium of the University’s cutting-edge microscopy facilities.

The RCC worked in close partnership with Dell Technologies to develop the system’s capabilities. This groundbreaking project was completed in two phases. The first used Dell EMC PowerEdge R740 servers as building blocks for processing data sets across 15 compute and analysis nodes, along with two additional nodes for visualization. The visualization nodes provide the processing power for a fully interactive experience when researchers view 4D data sets in real time at the edge of the scientific instruments throughout the university campus.

The second phase incorporated 15 Dell EMC PowerEdge C4140 servers, each with two Intel® Xeon® Gold 6132 Scalable processors (28 cores per node), 384GB of DDR4 RAM, four NVIDIA SXM2 Tesla V100 32GB GPUs, 1.6TB of Dell EMC NVMe flash storage, and 100Gbps Mellanox® EDR InfiniBand® networking.

The complete system includes 32 Dell EMC PowerEdge servers and 527,360 NVIDIA® Tesla® GPU cores distributed across the compute and analysis nodes.
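A quick back-of-envelope check, using only the phase-two node specifications above and assuming decimal units (1TB = 10^12 bytes, 1Gbps = 10^9 bits per second), shows how the pieces fit together:

```python
# Back-of-envelope figures derived from the phase-two node specs above.
# Assumes decimal units: 1 TB = 1e12 bytes, 1 Gbps = 1e9 bits per second.

NODES = 15                 # PowerEdge C4140 servers in phase two
GPUS_PER_NODE = 4          # NVIDIA V100 SXM2 GPUs per node
NVME_BYTES = 1.6e12        # 1.6 TB local NVMe flash per node
LINK_BPS = 100e9           # 100 Gbps EDR InfiniBand per node

total_gpus = NODES * GPUS_PER_NODE
print(f"phase-two GPUs: {total_gpus}")                         # 60

# Time to fill a node's local NVMe scratch at full line rate
fill_seconds = NVME_BYTES * 8 / LINK_BPS
print(f"fill 1.6 TB NVMe at line rate: {fill_seconds:.0f} s")  # 128 s
```

In other words, each node can stage its entire local flash tier over the fabric in a couple of minutes, which is why pairing fast local NVMe with 100Gbps interconnect suits these imaging workloads.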
UQ wanted a system with multiple GPUs to process massive computational tasks in parallel, such as those involved in data visualization and machine learning. The supercomputer also leverages an open-source parallel cluster file system designed specifically to manage I/O-intensive workloads in performance-critical environments. Both phases of the system are backed by Dell EMC ProSupport Plus for 24×7 technical support and assistance.

Half a million GPU cores in action

The Wiener supercomputer has a total performance of 11.3 petaFLOPS. Because the system pairs more than half a million CUDA cores with the parallel file system, it can easily handle multiple workloads, some of which generate 70–80TB of data per day. Researchers can process a single file at 105Gb/s, with up to 15 million IOPS.

“For the scale of machine learning to take place with the CPUs we previously had on campus, in some cases what we have now is 100 times faster,” says Jake Carroll, chief technology officer at UQ. “It’s literally untenable to run on CPUs [alone].” He continues, “The machine has become a plethora of massive machine learning and deep learning capabilities in the organization.
It’s the focal point of AI computing at the University of Queensland.”

The cutting edge of accelerated technology

With the accelerated computing power of the Wiener supercomputer, UQ researchers can solve pressing global challenges while helping maintain the school’s status as a top research university.

“Given demand for the system is still going up, it is our intention to work with Dell Technologies to stay on the cutting edge of accelerated technology,” Carroll says, “and we would like to be the first in the Asia Pacific to deploy whatever technological leap in this area comes next.”

To learn more

Read the case study, “Accelerating research breakthroughs.”
Read the press release, “The University of Queensland Uses Dell Technologies Supercomputer to Pioneer New Alzheimer’s Disease Breakthrough.”
Read the Wiener supercomputer technical specifications.
Learn about technologies for HPC/AI at DellEMC.com/HPC.
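For curious readers, the kind of wave modeling described at the top of this post can be illustrated with a toy one-dimensional sketch. This is a simplified finite-difference stand-in for the institute’s actual finite element models; every material value below is an illustrative assumption, not clinical data.

```python
import numpy as np

# Toy 1D acoustic wave sketch (finite differences, not the institute's
# real FEM code): an ultrasound pulse travels through a "skull" layer
# with a higher sound speed than the surrounding soft tissue.
nx, nt = 400, 900            # grid points, time steps
dx, dt = 1e-4, 1.2e-8        # 0.1 mm spacing; dt keeps the scheme stable
c = np.full(nx, 1540.0)      # assumed soft-tissue sound speed (m/s)
c[150:180] = 2800.0          # assumed skull layer, roughly 3 mm thick

x = np.arange(nx)
u_curr = np.exp(-((x - 50) / 4.0) ** 2)   # smooth initial pressure pulse
u_prev = u_curr.copy()                    # stationary start

for _ in range(nt):
    # Discrete Laplacian with fixed (zero) boundaries
    lap = np.zeros(nx)
    lap[1:-1] = u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]
    u_next = 2 * u_curr - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u_curr = u_curr, u_next

# How much of the pulse made it past the skull layer
transmitted = np.abs(u_curr[180:]).max()
print(f"peak amplitude beyond the skull layer: {transmitted:.3f}")
```

The real models are far richer (3D geometry, tissue attenuation, patient-specific skull maps), but the principle is the same: simulate the field element by element to predict how ultrasound behaves inside the head.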
The Edge, while frequently discussed as something new, is in fact another turn of the technical crank. Fueled by an abundance of smart devices and IoT sensors, worldwide data creation has been growing exponentially, driving our customers and partners to innovate. For example, between 2016 and 2018 there was an 878% growth in healthcare and life science data, resulting in over 8 petabytes of data managed by providers per annum. Dell Technologies has been at the forefront of this data revolution, enabling our customers and partners to leverage these new sources of data to drive business. The process of data creation, transformation and consumption has taken on new meaning as devices have become more integrated into our everyday lives. How this data lifecycle adds value to our customers and partners is the subject of our post today.

Data Creation

“Data is fuel.” We’ve heard this spoken time and time again. While that’s true, it doesn’t convey the process the data undergoes before it becomes something useful. “Data is fuel” is the net result of this process, not the genesis.

So, how do we get to this final, consumptive state with data? Data Creation is a constantly evolving mechanism driven by innovation, in technology as well as in society. For example, the idea of remote patient monitoring has evolved, enabled by complementary technologies like 5G networks and IoT sensors. The ability for health care providers to securely retrieve data from smart watches, pacemakers, blood pressure cuffs, temperature sensors, electrocardiograms and insulin pumps (to name just a few) has driven a new paradigm of patient care and engagement. This wouldn’t have been possible a few decades ago and, thanks to innovative approaches in networks, data management and sensors, it represents one of many unique applications of the data creation process.
Once this data is created, however, it must be transformed to be useful.

Data Transformation

Using the example of remote patient monitoring, the data generated by various sensors is unique. It has no intrinsic value as a “raw” data stream. Binary bits of encoded data provide no context, no perspective on what is happening with a patient. To fully understand, contextualize and derive useful consumptive value, the data must be transformed. This transformation process extracts information, correlates and curates it through applications like artificial intelligence and analytics, and provides it back in a human- and machine-readable format. 1s and 0s become more than their sum and, as transformed data, they’re ready to be consumed. Continuing with the patient monitoring example, the doctor receiving this information is then able to correlate and analyze data feeds from a variety of sensors and sources and view them with an eye toward application. Recently, a Dell Technologies customer was able to increase their analyst-to-support staff ratio to greater than 100:1, enabling them to leverage this data transformation to achieve better performance. Data is now one step closer to being fuel.

Data Consumption

By now, data has been created in various modalities, transformed by analytics and artificial intelligence, and is ready to be consumed. Consuming data is more than just visualizing an output; it is the action. Our doctor has received remote patient data, securely viewed the correlated results and is now ready to provide a diagnosis. The diagnosis is the net result of this generative model. Rather than being static or one-time-use, data consumption has taken on new meaning. Broadening this example, doctors use data to predict how to better counteract and treat disease. Machine learning models consume training data to learn to take future action and to create and transform outputs into new capabilities.
Manufacturers view vehicle data in extended reality (XR), peeling apart systems to experience the real-time interactions between components. This generative cycle continues to evolve as technology advances, making the most of data’s kinetic energy.

Conclusion

The Edge brings tremendous value to the creation, transformation and consumption model of data. Understanding where your organization’s data is on its journey at the Edge will enable you to make meaningful choices with the data at your disposal. From Dell Technologies Design Solutions to our modular data centers (MDCs), to our comprehensive portfolio of PowerEdge servers, storage and networking equipment, to ruggedized Dell gateways and laptops, you can be assured that as new technologies emerge and new modalities are created, you will be uniquely equipped to respond to the ever-changing landscape of the world economy.
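The raw-bits-to-information step described in the Data Transformation section above can be sketched in a few lines. This is a hypothetical illustration: the packed binary format, field names and thresholds below are invented for the example, not a real medical device protocol.

```python
import struct
from statistics import mean

# Hypothetical sketch of data transformation: a wearable streams
# heart-rate samples as packed binary, which carries no meaning until
# it is decoded and given clinical context.

def decode_stream(payload: bytes) -> list:
    """Unpack a stream of little-endian unsigned 16-bit samples (bpm)."""
    count = len(payload) // 2
    return list(struct.unpack(f"<{count}H", payload))

def contextualize(samples: list) -> dict:
    """Turn raw samples into information a clinician can act on."""
    avg = mean(samples)
    return {
        "avg_bpm": round(avg, 1),
        "max_bpm": max(samples),
        "flag": "tachycardia?" if avg > 100 else "normal",  # toy threshold
    }

# Raw bytes on the wire provide no context on their own...
raw = struct.pack("<6H", 72, 75, 74, 71, 73, 76)
# ...until transformation makes them consumable.
report = contextualize(decode_stream(raw))
print(report)   # {'avg_bpm': 73.5, 'max_bpm': 76, 'flag': 'normal'}
```

In production the “transform” stage would involve analytics or AI models rather than a fixed threshold, but the shape of the pipeline, decode, correlate, contextualize, is the same.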
“Massively disruptive,” “a game changer,” and “will redefine industries” are just some of the phrases market researchers have used over the last year to describe the potential of edge computing. According to Gartner, by the year 2025, 75% of all data will be processed at the edge.

What implications does this have for companies designing solutions today? How have solution builders embraced edge computing now, and how are they shifting their business models to prepare for the expected influx of edge data processing needs in the near future?

Massive opportunity with big data

From working with our OEM customers, I know that faster access to processed data is already opening new business models and driving innovation and opportunity across every industry. The rationale for this shift is clear. Increasingly, both OEMs and end users need to be able to process enormous amounts of data on site, giving them the power to make smarter decisions, faster. By embracing edge computing, these companies are seeing real results in reduced cost, latency and connectivity challenges for both themselves and their customers.

Security solutions, AI platforms and billable services

For example, OEM customers such as BCDVideo are using edge technology to design security solutions that give their end users the ability to process data locally instead of sending it across the network, where it’s more vulnerable. Others, like Kinetica, are designing AI platforms that can collect and analyze the volumes of edge data their customers need to process. OEMs are also developing new revenue streams, introducing new billable services to differentiate themselves from the competition and increase the stickiness of customer accounts.

IT meets OT on the factory floor

It’s probably not a surprise to anyone familiar with the industrial sector, but we’re also seeing the Enterprise Edge evolve, with platforms traditionally used in a datacenter now deployed on the factory floor.
The production line, where data is gathered, is also changing. In the past, edge devices would typically have been small PC-type computers. Now we’re seeing demand for ruggedized, long-life systems that can run heavier workloads, like analytics, machine learning, vision systems and robotics.

The result? OEMs are using edge technology to change how they design and manufacture their own products, which in turn can positively impact their own customers in areas like cost savings and faster turnaround times.

Managing IT infrastructure on the factory floor

OEM customer Rockwell Automation is a great example. Rockwell serves a wide range of industries globally, from wastewater and consumer packaged goods through to oil and gas. According to Rockwell (registration required), IT and OT convergence represents a challenge for many customers. Despite cyber security risks and the increased reliance on data and network infrastructure to enable manufacturing, there’s a shortage of skills and staff to manage IT infrastructure on the factory floor and optimize operations. Yet ten minutes of downtime can cost over $10,000.

In response, Rockwell worked with OEM Solutions to engineer a virtual, turnkey solution for their customers, designed on Dell PowerEdge XL servers, plus a range of managed services that help their customers meet their product schedules.

The world’s first learning steel mill

Industry leader Big River Steel wanted to optimize operations and reduce waste at its $1.3 billion scrap metal recycling and steel production facility. Using edge compute, servers and storage, plus an AI platform designed by our OEM customer Noodle.ai on Dell Technologies infrastructure, Big River Steel is now regarded as the world’s first learning steel mill, embedding AI and machine learning into its manufacturing processes.

The result? Big River Steel has insights into all the factors responsible for quality variability and knows the optimal manufacturing input parameters.
Importantly, the company has reported up to a 50% reduction in steel yield strength variability and up to $10 million in savings.

Market trends

I’ve shared some examples of edge computing, but let’s look at the market research data. In a recent VDC survey of over 700 global product development decision makers, 42% are already deploying edge computing, with 26% planning to do so within the next year. Interestingly, 36% say they’re investing in edge computing specifically because of customer demand, while around 20% of respondents say competitive pressure is driving this evolution.

How does this translate into system deployments? Over 60% of respondents said they are now implementing additional remote monitoring and control within their industrial systems. Thirty-eight percent of organizations are enabling services to improve product lifecycle and inventory management, while 19% are introducing new, billable services.

A world of possibilities

What does this mean for the future? I believe the intelligent edge is the glue uniting IT and OT environments, the final piece of the puzzle, if you will. Over the next decade, I predict that every piece of industrial equipment will be fitted with an embedded or connected computer, allowing it to make intelligent decisions on its own. For our OEM customers, there’s a huge opportunity to develop edge applications and enable software-defined functionality. We will see new professional and billable services emerge with usage-based fees. Edge technologies will be all around us, performing distributed computing across a multitude of devices in cities, factories, energy plants and farms.

Invest now in your future business

From an OEM and end customer perspective, increasing edge computing demands will accelerate the need for the right compute, connectivity and storage, as well as more flexible hardware and software platforms.
2025 may still seem like the distant future, but the partnership and technology choices you make now can and will impact the health of your future business.

Give your business the extra edge: Dell Technologies OEM Solutions can help you design, deliver and support a wide range of customized industrial edge solutions.

Listen to our recorded webinar, “Driving Industrial Innovation at the Edge” (registration required).
Learn more about what we do in industrial automation at Dell Technologies OEM Solutions and read more about edge computing here.
Follow us on Twitter @delltechdesign.
Join our LinkedIn Dell Technologies OEM Solutions Showcase page.