A Reboot for VMware?

August 31, 2016

What do you do when you are a company that once held a dominant position in a lucrative market like IT, but the technology revolution you started has passed you by? If you're VMware, you roll with the changes. It was basic server virtualization, after all, that led to the cloud, software-defined infrastructure, containers, and the mobile, Big Data, and IoT architectures they now support. But as IBM, Microsoft, and countless other technology leaders have learned before and since, the world doesn't sit still just because an old problem has been solved; the focus shifts immediately to the new problems that the new working environment presents.

Spotlight

CloudLync TechSol pvt ltd

A Global Solution Integrator delivering technology solutions in Network Infrastructure, Unified Communications, Data Centre / Virtualization, Information Security, and Enterprise Applications. CloudLync is a leader in Enterprise Communications in India and has a significant presence in the Middle East, North America, and Australia / New Zealand, working in collaboration with global technology partners such as Avaya, Cisco, HP, Juniper, Palo Alto, Microsoft, NetApp, and Polycom.

OTHER ARTICLES
VIRTUAL DESKTOP TOOLS

Digital Marketplace: The Future of E-commerce

Article | June 24, 2022

It is no surprise that e-commerce has grown dramatically in recent years; the pandemic and a handful of other market factors have clearly played a role. Since ancient times, marketplaces of all shapes and sizes have served as the foundation for all types of business. As the world becomes more digital, digital marketplaces, e-commerce, and other forms of online business are exploding. E-commerce marketplace platforms are expanding rapidly and are expected to gain momentum as the future of e-commerce, because online marketplaces aggregate demand and give customers a broader selection of products.

Digital Marketplaces Are the Way to the Future of E-Commerce

Without a doubt, online marketplaces will dominate the e-commerce business in the coming years. According to Coresight Research, marketplace platform revenue will more than double, reaching around $40 billion in 2022, and Forrester expects online marketplaces to account for 67% of worldwide e-commerce revenue by 2022. Today, the issue is not whether you sell online but how far you can reach. E-commerce offers limitless opportunities; all you need to do is keep pace with the trends. Have you already made the transition from local to global? Digital marketplaces are indeed the way of the future of e-commerce, and the earlier you integrate them into your sales and marketing approach, the better. The world is changing, and your competitors are not sleeping; you cannot overlook this trend if you want to stay ahead. As it has always been, business is about people: understanding who you are pitching to is critical to your success, and everything you do in business should bring you closer to your target audience.

Closing Lines

Digital marketplaces are indeed the future of commerce. People will inevitably shop online even more in the future, which means methods and means will be developed to make such transactions easier for the average individual. Explore how your business might profit from these marketplaces and from the trends that suggest the future of physical and online shopping.

VIRTUAL DESKTOP TOOLS

Why Are Businesses Tilting Towards VDI for Remote Employees?

Article | July 7, 2022

Although remote working became popular during the COVID era, the technology that delivers the best user experience (UX) for remote work was developed more than three decades ago. Citrix was founded in 1989 as one of the first software businesses to provide the ability to execute any program on any device over any connection, and in 2006 VMware coined the term "virtual desktop infrastructure" (VDI) to designate its virtualization products. Many organizations created remote work arrangements in response to the COVID-19 pandemic, and the phenomenon continues in 2022. Of the methods organizations have used to facilitate remote work over the years, VDI has been one of the most effective, allowing businesses to centralize their IT resources and give users remote access to a consolidated pool of computing capacity.

Why Should Businesses Use VDI for Their Remote Employees?

Companies can find it difficult to scale their operations and grow while operating remotely. VDI can support these efforts by eliminating some of the downsides of remote work.

Device Agnostic

As long as employees have adequate internet connectivity, their virtual desktops can accompany them across the world, accessed from a tablet, phone, laptop, thin client, or Mac.

Reduced Support Costs

Because VDI deployments can usually be handled by a smaller IT team than traditional PC fleets, support expenses go down.

Enhanced Security

Data security improves because data never leaves the data center. There is no need to worry about sensitive data on every hard disk in every computer; nothing is stored on the end machine while using the VDI workspace. This also safeguards intellectual property when working with contractors, partners, or a worldwide workforce.

Comply with Regulations

With virtual desktops, organizational data never leaves the data center. Remote employees with regulatory duties to protect client or patient data can work confidently, because there is no risk of data leaking from a lost or stolen laptop or a retired PC.

Enhanced User Experience

With a solid user experience (UX), employees can work from anywhere. They can connect to all of their business applications and tools from wherever they choose to work, exactly as if they were sitting at their office desk, and even answer the phone if they want to.

Closing Lines

One of COVID-19's lessons has been to be prepared for almost anything; few IT leaders were planning their investments with a pandemic in mind. However the pandemic plays out, the rise of remote work is here to stay. If VDI at scale is to become a permanent feature of business IT strategies, now is the moment to assess where, when, and how your organization can implement the appropriate solutions. Moreover, businesses that adopt VDI may find that the added flexibility extends their computing refresh cycles.

VMWARE

How to Start Small and Grow Big with Data Virtualization

Article | December 14, 2021

Why Should Companies Care about Data Virtualization?

Data is everywhere. With each passing day, companies generate more data than ever before. What exactly can they do with all this data? Is it just a matter of storing it, or should they manage and integrate data from their various sources? How can they store, manage, integrate, and utilize their data to gain information of critical value to their business? As they say, knowledge is power, but knowledge without action is useless.

This is where the Denodo Platform comes in. The Denodo Platform gives companies the flexibility to evolve their data strategies, migrate to the cloud, or logically unify their data warehouses and data lakes, without affecting the business. The platform offers a variety of subscription options. Companies often start with individual projects on a Denodo Professional subscription, then add more and more data sources and move up to other Denodo subscriptions such as Denodo Enterprise or Denodo Enterprise Plus. The upgrade process is easy to establish; in fact, it can be done in less than a day once the cloud marketplace is chosen (Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)). In as little as six weeks, companies can realize real business benefits from managing and utilizing their data effectively.

A Bridging Layer

Data virtualization has been around for quite some time. Denodo's founders, Angel Viña and Alberto Pan, have been involved in data virtualization since as far back as the 1990s. If you are not familiar with it, here is a quick summary: data virtualization is the cornerstone of a logical data architecture, whether a logical data warehouse, logical data fabric, data mesh, or data hub. All of these architectures are best served by three principles: Connect (to all your data sources), Combine (them into a logical single view), and Consume (through standard connectors to your favorite BI and data science tools, or through robust, easy-to-use APIs). Data virtualization is the bridge that joins multiple data sources to fuel analytics. It is the logical data layer that integrates data silos across disparate systems, manages unified data for centralized security, and delivers it to business users in real time.

Economic Benefits in Less Than Six Weeks with Data Virtualization

In a short duration, how can companies benefit from choosing data virtualization as a data management solution? The recently released Forrester study on the Total Economic Impact of data virtualization reports some striking KPIs. Companies that have implemented data virtualization have seen an 83% increase in business user productivity, largely because a data virtualization platform gives business users an easy-to-access, democratized interface to the data they need. The second KPI is a 67% reduction in development resources: with data virtualization you connect to the data, you do not copy it, so once it is set up there is far less need for data integration engineers, as data remains in the source location and is not copied around the enterprise. Finally, companies report a 65% improvement in data access speeds over more traditional approaches such as extract, transform, and load (ETL) processes.

A Modern Solution for an Age-Old Problem

A few use cases show how companies across industries have leveraged data virtualization to elevate projects to an enterprise level. In finance and banking, data virtualization often serves as a unifying platform to improve compliance and reporting. In retail, use cases include predictive analytics in supply chains and next-best-action decisions from a unified view of the customer. Data virtualization also fits a wide variety of other settings, such as healthcare and government agencies, where companies use the Denodo Platform to help data scientists understand key sociological and economic trends. In a nutshell, if data exists in more than one source, the Denodo Platform acts as the unifying platform that connects, combines, and lets users consume the data in a timely, cost-effective manner.
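The Connect / Combine / Consume idea above can be sketched in a few lines. This is a toy stand-in, not the Denodo API: two "source systems" are modeled as SQLite tables, and a view plays the role of the logical single layer, so the consumer queries one interface while the data stays in its sources. All table and column names are hypothetical.

```python
import sqlite3

# Toy sketch of a data-virtualization layer: a CRM table and an ERP
# table (the "silos") are exposed through one logical view, so the
# consumer queries a single unified interface instead of each source.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE crm_customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE erp_orders   (customer_id INTEGER, amount REAL);

    INSERT INTO crm_customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO erp_orders    VALUES (1, 120.0), (1, 80.0), (2, 40.0);

    -- The 'logical single view': rows are joined on demand at query
    -- time; nothing is copied out of the source tables.
    CREATE VIEW customer_spend AS
        SELECT c.name, SUM(o.amount) AS total
        FROM crm_customers c JOIN erp_orders o ON o.customer_id = c.id
        GROUP BY c.name;
""")

for name, total in conn.execute("SELECT * FROM customer_spend ORDER BY name"):
    print(f"{name}: {total}")
# prints: Acme: 200.0  then  Globex: 40.0
```

The key property mirrored here is the one the article stresses: the view is defined over the sources rather than materialized from them, which is why "you connect to the data, you do not copy it."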

VIRTUAL DESKTOP STRATEGIES

Addressing Multi-Cloud Complexity with VMware Tanzu

Article | June 7, 2022

Introduction

With cloud computing on the path to becoming the mother of all transformations, particularly in IT's ways of development and operations, we are once again confronted with the problem of conversion errors, this time a hundredfold greater than in previous moves to distributed computing and the web. While the issue is evident, the remedies are not so obvious. Cloud complexity is the outcome of the rapid acceleration of cloud migration and net-new innovation without consideration of the operational complexity this introduces. Almost all businesses already work in a multi-cloud or hybrid-cloud environment; according to an IDC report, 93% of enterprises utilize multiple clouds. The decision may have stemmed from a desire to save money and avoid vendor lock-in, or to increase resilience, or businesses may simply have found themselves with several clouds as a result of the compounding activities of different teams. When it comes to strategic technology choices, relatively few businesses begin by asking, "How can we secure and control our technology?"

Must-Follow Methods for Multi-Cloud and Hybrid-Cloud Success

Data Analysis at Any Size, from Any Source: To proactively recognize problems, raise alerts, and guide investigations, teams should be able to utilize all data across the cloud and on-premises.

Insights in Real Time: Given the ephemeral nature of containerized operations and functions-as-a-service, businesses cannot wait minutes to determine whether they are experiencing infrastructure difficulties. Only a scalable streaming architecture can ingest, analyze, and alert rapidly enough to discover and investigate problems before they have a major impact on consumers.

Analytics That Enable Teams to Act: Because multi-cloud and hybrid-cloud strategies do not belong to a single team, businesses must be able to evaluate data within and across teams in order to make decisions and act swiftly.

How Can VMware Help in Solving Multi-Cloud and Hybrid-Cloud Complexity?

VMware has made several announcements indicating a new strategy focused on modern applications. The approach centers on two VMware products: vSphere with Kubernetes, and Tanzu. Much has been said about VMware's modern-app approach since then, and several products have launched; let's focus on VMware Tanzu.

VMware Tanzu

Tanzu is a product that enables organizations to upgrade both their applications and the infrastructure that supports them. In the same way that VMware wants vRealize to be known for cloud management and automation, it wants Tanzu to be known for modern business applications. Tanzu uses Kubernetes to build and manage modern applications, with a single development environment and a single deployment process, and it is compatible with both private and public cloud infrastructures.

Closing Lines

The important point is that the Tanzu portfolio offers a great deal of flexibility in terms of where applications run and how they are controlled. We observe increasing demand for running an application on any cloud, and VMware Tanzu helps streamline multi-cloud operation for MLOps pipelines. Beyond multi-cloud operation, it is critical to monitor and alert on each component throughout the MLOps lifecycle, from Kubernetes pods and inference services to data and model performance.
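Because Tanzu builds on Kubernetes, the unit of deployment it ultimately manages is a standard Kubernetes object. As a rough illustration of what "one deployment process across clouds" rests on, here is a minimal Deployment manifest built with only the standard library (Kubernetes accepts JSON as well as YAML; the app name and image below are hypothetical placeholders, not Tanzu-specific values):

```python
import json

# Build a minimal Kubernetes Deployment manifest as a plain dict.
# The same object can be applied to any conformant cluster, private
# or public, which is the portability Kubernetes-based platforms rely on.
def deployment(name: str, image: str, replicas: int = 2) -> dict:
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

manifest = deployment("inference-svc", "registry.example.com/inference:1.0")
print(json.dumps(manifest, indent=2))
```

The portability claim in the article reduces to exactly this: the manifest describes the desired state, and any Kubernetes cluster, wherever it runs, reconciles toward it.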


Related News

VIRTUAL SERVER MANAGEMENT

AWS Announces General Availability of Amazon EC2 C7g Instances Powered by AWS-designed Graviton3 Processors

Amazon Web Services | May 24, 2022

Amazon Web Services, Inc. (AWS), an Amazon.com, Inc. company, today announced the general availability of Amazon Elastic Compute Cloud (Amazon EC2) C7g instances, the next generation of compute-optimized instances powered by AWS-designed Graviton3 processors. New C7g instances use AWS Graviton3 processors to provide up to 25% better compute performance for compute-intensive applications than current generation C6g instances powered by AWS Graviton2 processors. The higher performance of C7g instances makes it possible for customers to run a wide range of compute-intensive workloads more efficiently—from web servers, load balancers, and batch processing to electronic design automation (EDA), high performance computing (HPC), gaming, video encoding, scientific modeling, distributed analytics, machine learning inference, and ad serving. There are no minimum commitments or upfront fees to use C7g instances, and customers pay only for the amount of compute used.

Since launching in 2020, Amazon EC2 instances powered by AWS-designed Graviton2 processors have provided customers with significant performance improvements and cost savings for a broad range of applications. Today, 48 of the top 50 Amazon EC2 customers use AWS Graviton2-based instances to deliver superior price performance to their customers. Customers like DirecTV, Discovery, Epic Games, Formula 1, Honeycomb.io, Intuit, Lyft, Mercado Libre, NextRoll, Nielsen, SmugMug, Snap, Splunk, and Sprinklr have seen significant performance gains, with reduced costs, running AWS Graviton2-based instances in production. The AWS Graviton-based instance portfolio offers 13 different instances that include general purpose, compute optimized, memory optimized, storage optimized, burstable, and accelerated computing instances, so customers have the deepest and broadest choice of high-performance, cost-effective, and power-efficient compute in the cloud for all sorts of applications.
As customers bring more compute-intensive workloads to the cloud to transform their organizations and fuel new opportunities, they want even better price performance and greater energy efficiency when running these demanding workloads. To provide even better price performance for a wide variety of customer applications, new C7g instances powered by next generation AWS Graviton3 processors provide up to 25% better performance for compute-intensive applications over current generation C6g instances. Compared to previous generation AWS Graviton2 processors, AWS Graviton3 processors deliver up to 2x faster performance for cryptographic workloads, up to 3x faster performance for machine learning inference, and nearly 2x higher floating point performance for scientific, machine learning, and media encoding workloads. AWS Graviton3 processors are also more energy efficient, using up to 60% less energy for the same performance than comparable EC2 instances. C7g instances are the first in the cloud to feature the latest DDR5 memory, which provides 50% higher memory bandwidth than AWS Graviton2-based instances to improve the performance of memory-intensive scientific applications like computational fluid dynamics, geoscientific simulations, and seismic processing. C7g instances also deliver 20% higher networking bandwidth than C6g instances for network intensive applications like network load balancing and data analytics. “Customers of all sizes are seeing significant performance gains and cost savings using AWS Graviton-based instances. Since we own the end-to-end chip development process, we’re able to innovate and deliver new instances to customers faster. 
With up to 25% better performance than current generation Graviton instances, new C7g instances powered by AWS Graviton3 processors make it easy for organizations to get the most value from running their infrastructure on AWS,” said David Brown, Vice President of Amazon EC2 at AWS.

New C7g instances are built on the AWS Nitro System, a collection of AWS-designed hardware and software innovations that streamline the delivery of isolated multi-tenancy, private networking, and fast local storage. The AWS Nitro System offloads the CPU virtualization, storage, and networking functions to dedicated hardware and software, delivering performance that is nearly indistinguishable from bare metal. For customers looking to enhance the performance of applications that require parallel processing like HPC and video encoding, C7g instances will in the coming weeks include support for Elastic Fabric Adapter (EFA), which allows applications to communicate directly with network interface cards, providing lower and more consistent latency. C7g instances are available for purchase as On-Demand Instances, with Savings Plans, as Reserved Instances, or as Spot Instances. C7g instances are available today in US East (N. Virginia) and US West (Oregon), with availability in additional AWS Regions coming later this year.

Snap Inc. is a camera company focused on empowering people to express themselves, live in the moment, learn about the world, and have fun together. “We trialed the new AWS Graviton3-based Amazon EC2 C7g instances and found that they provide significant performance improvements on real workloads compared to previous generation C6g instances,” said Aaron Sheldon, Software Engineer at Snap.
“We are excited to migrate our Graviton2-based workloads to Graviton3, including the messaging, storage and the friend graph workloads.”

Sprinklr helps the world's biggest companies make their customers happier across 30+ digital channels—using the most advanced, sophisticated AI engine built for the enterprise to create insight-driven strategies and better customer experiences. “We run a wide variety of workloads on AWS Graviton-based instances for their significant price performance benefits,” said Jamal Mazhar, Vice President of Infrastructure and DevOps at Sprinklr. “After the announcement of AWS Graviton3, we benchmarked our workloads on the new Amazon EC2 C7g instances and observed 27% better performance compared to the previous generation instances. Based on these results, we are looking forward to adopting AWS Graviton3-based instances in production.”

NextRoll, Inc. is a marketing and data technology company with a mission to accelerate growth for companies, big and small. Powered by machine learning, NextRoll's technology gathers data, delivers reliable insights, and provides businesses with approachable tools to target buyers in strategic ways – all on one platform. “We have found that AWS Graviton3-based C7g instances are ideal for bidders, ad servers, and ElastiCache clusters,” said Valentino Volonghi, CTO at NextRoll. “We are seeing about 15% more requests handled by C7g instances compared to AWS Graviton2-based C6g instances. With C7g instances, we also observed up to 40% better latency. Based on these findings, we are looking forward to adopting AWS Graviton3-based C7g instances in production.”

Ansys is a global leader in engineering simulation. “As engineers and designers face increasingly complex problems, cloud computing helps lower the barrier of access to high-performance computing, allowing users to solve problems faster,” said Prith Banerjee, Chief Technology Officer at Ansys.
“Ansys has also been focusing on green computing initiatives with the goal of improving energy efficiency and reducing costs to customers. With the support of LS-DYNA on the AWS Graviton3 processor powered by AWS, Ansys customers will get the best of both worlds – access to a world-class multiphysics solver without compromising on speed, and lower energy and costs.”

Beamr is a leading provider of image and video optimization solutions that enable professional photographers to improve their workflows, photo sharing services to improve user experience (UX) and reduce churn, and video service providers to reduce storage and delivery costs. “Beamr's JPEGmini software, written in C/C++, optimizes JPEG image files by reducing their file size without compromising quality. The application is compute-intensive and includes functions for image decoding, image encoding, and a quality measure algorithm that analyzes various image attributes,” said Dan Julius, Vice President of R&D at Beamr. “Since the mobile version of this software runs on Arm processors, we decided to test its performance on AWS Graviton3-based Amazon EC2 C7g instances. Rebuilding our software to run on C7g instances took us one working day, and the results were promising. When running on C7g instances, we saw 30% improved performance over comparable x86-based instances. Based on these results, we plan to recommend to our customers to run the Beamr JPEGmini software on Graviton3-based instances once those become GA, and we plan to benchmark Beamr's H.264 and HEVC video encoders on Graviton instances as well.”

About Amazon Web Services

For over 15 years, Amazon Web Services has been the world's most comprehensive and broadly adopted cloud offering.
AWS has been continually expanding its services to support virtually any cloud workload, and it now has more than 200 fully featured services for compute, storage, databases, networking, analytics, machine learning and artificial intelligence (AI), Internet of Things (IoT), mobile, security, hybrid, virtual and augmented reality (VR and AR), media, and application development, deployment, and management from 84 Availability Zones within 26 geographic regions, with announced plans for 24 more Availability Zones and eight more AWS Regions in Australia, Canada, India, Israel, New Zealand, Spain, Switzerland, and the United Arab Emirates. Millions of customers—including the fastest-growing startups, largest enterprises, and leading government agencies—trust AWS to power their infrastructure, become more agile, and lower costs.
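The "better performance at the same price" claims in the release reduce to a price-performance ratio. A quick back-of-the-envelope sketch of that comparison (the throughput numbers and hourly prices below are hypothetical placeholders, not published AWS rates or benchmark results):

```python
# Back-of-the-envelope price-performance comparison between two
# instance generations. Throughput and prices are hypothetical
# placeholders, not published AWS rates or benchmark figures.
def price_performance(throughput: float, hourly_price: float) -> float:
    """Work done per dollar spent: higher is better."""
    return throughput / hourly_price

c6g = price_performance(throughput=100.0, hourly_price=0.10)
c7g = price_performance(throughput=125.0, hourly_price=0.10)  # +25% perf, same price

improvement = (c7g / c6g - 1) * 100
print(f"price-performance improvement: {improvement:.0f}%")
# prints: price-performance improvement: 25%
```

The point of the arithmetic: when the price per hour is unchanged, a 25% throughput gain translates one-for-one into a 25% price-performance gain; any price difference between generations would shift that ratio accordingly.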


VMWARE

Broadcom in Talks to Acquire Cloud Company VMware

VMware | May 23, 2022

Broadcom Inc. is in talks to acquire VMware Inc., the cloud-computing company backed by billionaire Michael Dell, according to people familiar with the matter, setting up a blockbuster tech deal that would vault the chipmaker into a highly specialized area of software. The discussions are ongoing and there's no guarantee they will lead to a purchase, said the people, who asked not to be identified because the matter isn't public. VMware currently has a market valuation of about $40 billion. Assuming a typical premium, the potential deal price would be higher, though the terms under consideration couldn't be learned. Shares in VMware rose 15% in premarket trading on Monday, which would give the company a market value of about $46 billion. Broadcom, which has a valuation of about $222 billion, fell 2.4%.

The transaction would extend a run of acquisitions for Broadcom Chief Executive Officer Hock Tan, who has built one of the largest and most diversified companies in the chip industry. Software has been a key focus in recent years, with Broadcom buying CA Technologies in 2018 and Symantec Corp.'s enterprise security business in 2019. A representative for VMware declined to comment. A representative for Broadcom wasn't available for comment.

“Investors have been increasingly focused on Broadcom's appetite for another strategic or platform enterprise software acquisition—especially given the recent compression in software valuation,” Wells Fargo analysts wrote after Bloomberg News's report. “An acquisition of VMware would be considered as making strategic sense; consistent with Broadcom's focus on building out a deepening enterprise infrastructure software strategy.”

Broadcom makes a wide range of electronics, with its products going into everything from the iPhone to industrial equipment. But data centers have become a critical source of growth, and bulking up on software gives the company more ways to target that market.
Broadcom was previously in talks to acquire SAS Institute Inc., a closely held software company valued at $15 billion to $20 billion. But those discussions ended last year without a deal.


SERVER VIRTUALIZATION

Leading Global Logistics Company Completes Milestone in Data Platform Modernization with Datometry

Datometry | May 21, 2022

Datometry, the pioneer of database virtualization, announced today that a leading integrated container logistics company has completed a critical milestone of its data platform modernization with Datometry Hyper-Q. Using the Datometry Hyper-Q virtualization platform – the first to make existing applications fully interoperable with cloud databases without disrupting existing business processes – the company migrated from their on-premises system to Microsoft Azure Synapse in record time. The logistics company, ranked within the Forbes Global 2000, sought to migrate from its legacy, on-premises data warehouse – which was known for being one of the most complex and sophisticated installations of its kind – to a modern cloud data warehouse (CDW) that is cost effective, highly scalable, and supports the flexibility and speed demanded by its customers' supply chains. The company needed a solution that would enable it to maintain its existing high-volume ETL processes and simultaneously serve a large user community of business analysts and data scientists. Datometry Hyper-Q uniquely addressed its customer's business objectives, enabling the logistics leader to transfer its existing applications natively to Azure Synapse without costly rewrites of SQL code, at a fraction of the time and risk associated with typical database migrations. The customer saved tens of millions by migrating with Datometry, and its new, fully managed CDW is much more cost-effective to operate than its legacy database. The customer was able to preserve its long-standing investments in ETL, analytics, reporting and BI entirely. As a global pioneer in ocean and inland shipping, our customer – arguably the worldwide leader in logistics - has long been on the forefront of organizations' ability to meet the needs of businesses - and those businesses' customers - worldwide. 
An enterprise the size and scope of this customer migrating to the cloud with Datometry and Microsoft so quickly, without the cost and risk of a typical migration, demonstrates how database virtualization can unlock the benefits of the cloud for any organization." Mike Waas, CEO, Datometry. Datometry's customer considered a conventional database migration at first, but decided on Datometry for its digital transformation upon determining a conventional migration would take at least five years, cost several tens of millions of dollars, yet present only a 20% chance of success. Datometry Hyper-Q is used by leading Fortune 500 and Global 2000 enterprises to accelerate cloud modernization and move workloads between data warehouses. The Datometry Hyper-Q virtualization platform eliminates risk-laden, expensive, and time-consuming application rewrites. About Datometry Datometry is the global leader in database system virtualization. Datometry empowers enterprises to run their existing applications directly on the cloud database of their choice without the business disruption of costly and risk-laden database migrations and application rewrites. Leading Fortune 500 and Global 2000 enterprises worldwide realize significant cost savings and consistently outpace their competition by using Datometry during this critical period of transformation to cloud-native data management.
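The core idea behind a platform like Hyper-Q is that the application's existing SQL is translated to the target warehouse's dialect on the fly, so no hand-rewriting is needed. A deliberately tiny illustration of that kind of rewriting follows; the two rules shown (a Teradata `SEL` abbreviation and an `ADD_MONTHS` call mapped to T-SQL-style `DATEADD`) are just hypothetical examples, and real systems parse the full SQL grammar rather than applying regexes:

```python
import re

# Toy sketch of SQL dialect translation: rewrite a couple of
# Teradata-isms into the target dialect. Real virtualization layers
# use a full SQL parser; these two regex rules only illustrate the idea.
REWRITES = [
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),   # Teradata abbreviation
    (re.compile(r"\bADD_MONTHS\(([^,]+),\s*(\d+)\)", re.IGNORECASE),
     r"DATEADD(month, \2, \1)"),                         # function-name mapping
]

def translate(sql: str) -> str:
    """Apply each rewrite rule in order and return the translated query."""
    for pattern, replacement in REWRITES:
        sql = pattern.sub(replacement, sql)
    return sql

legacy = "SEL order_id, ADD_MONTHS(ship_date, 3) FROM orders"
print(translate(legacy))
# prints: SELECT order_id, DATEADD(month, 3, ship_date) FROM orders
```

Because the translation happens between the unchanged application and the new warehouse, the application's ETL jobs, reports, and BI tools keep issuing the SQL they always have, which is the property the migration described above depends on.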

Read More

VIRTUAL SERVER MANAGEMENT

AWS Announces General Availability of Amazon EC2 C7g Instances Powered by AWS-designed Graviton3 Processors

Amazon Web Services | May 24, 2022

Amazon Web Services, Inc. (AWS), an Amazon.com, Inc. company today announced the general availability of Amazon Elastic Compute Cloud (Amazon EC2) C7g instances, the next generation of compute-optimized instances powered by AWS-designed Graviton3 processors. New C7g instances use AWS Graviton3 processors to provide up to 25% better compute performance for compute-intensive applications than current generation C6g instances powered by AWS Graviton2 processors. The higher performance of C7g instances makes it possible for customers to run more efficiently a wide range of compute-intensive workloads—from web servers, load balancers, and batch processing to electronic design automation (EDA), high performance computing (HPC), gaming, video encoding, scientific modeling, distributed analytics, machine learning inference, and ad serving. There are no minimum commitments or upfront fees to use C7g instances, and customers pay only for the amount of compute used. Since launching in 2020, Amazon EC2 instances powered by AWS-designed Graviton2 processors have provided customers with significant performance improvements and cost savings for a broad range of applications. Today, 48 of the top 50 Amazon EC2 customers use AWS Graviton2-based instances to deliver superior price performance to their customers. Customers like DirecTV, Discovery, Epic Games, Formula 1, Honeycomb.io, Intuit, Lyft, Mercardo Libre, NextRoll, Nielsen, SmugMug, Snap, Splunk, and Sprinklr have seen significant performance gains, with reduced costs, running AWS Graviton2-based instances in production. The AWS Graviton-based instance portfolio offers 13 different instances that include general purpose, compute optimized, memory optimized, storage optimized, burstable, and accelerated computing instances, so customers have the deepest and broadest choice of high-performance, cost-effective, and power-efficient compute in the cloud for all sorts of applications. 
As customers bring more compute-intensive workloads to the cloud to transform their organizations and fuel new opportunities, they want even better price performance and greater energy efficiency when running these demanding workloads. To deliver that, new C7g instances powered by next generation AWS Graviton3 processors provide up to 25% better performance for compute-intensive applications over current generation C6g instances. Compared to previous generation AWS Graviton2 processors, AWS Graviton3 processors deliver up to 2x faster performance for cryptographic workloads, up to 3x faster performance for machine learning inference, and nearly 2x higher floating point performance for scientific, machine learning, and media encoding workloads. AWS Graviton3 processors are also more energy efficient, using up to 60% less energy for the same performance as comparable EC2 instances. C7g instances are the first in the cloud to feature the latest DDR5 memory, which provides 50% higher memory bandwidth than AWS Graviton2-based instances to improve the performance of memory-intensive scientific applications like computational fluid dynamics, geoscientific simulations, and seismic processing. C7g instances also deliver 20% higher networking bandwidth than C6g instances for network-intensive applications like network load balancing and data analytics.

“Customers of all sizes are seeing significant performance gains and cost savings using AWS Graviton-based instances. Since we own the end-to-end chip development process, we’re able to innovate and deliver new instances to customers faster. With up to 25% better performance than current generation Graviton instances, new C7g instances powered by AWS Graviton3 processors make it easy for organizations to get the most value from running their infrastructure on AWS,” said David Brown, Vice President of Amazon EC2 at AWS.

New C7g instances are built on the AWS Nitro System, a collection of AWS-designed hardware and software innovations that streamline the delivery of isolated multi-tenancy, private networking, and fast local storage. The AWS Nitro System offloads the CPU virtualization, storage, and networking functions to dedicated hardware and software, delivering performance that is nearly indistinguishable from bare metal. For customers looking to enhance the performance of applications that require parallel processing like HPC and video encoding, C7g instances will, in the coming weeks, include support for Elastic Fabric Adapter (EFA), which allows applications to communicate directly with network interface cards, providing lower and more consistent latency.

C7g instances are available for purchase as On-Demand Instances, with Savings Plans, as Reserved Instances, or as Spot Instances. C7g instances are available today in US East (N. Virginia) and US West (Oregon), with availability in additional AWS Regions coming later this year.

Snap Inc. is a camera company focused on empowering people to express themselves, live in the moment, learn about the world, and have fun together. “We trialed the new AWS Graviton3-based Amazon EC2 C7g instances and found that they provide significant performance improvements on real workloads compared to previous generation C6g instances,” said Aaron Sheldon, Software Engineer at Snap.
“We are excited to migrate our Graviton2-based workloads to Graviton3, including the messaging, storage, and friend graph workloads.”

Sprinklr helps the world's biggest companies make their customers happier across 30+ digital channels—using the most advanced, sophisticated AI engine built for the enterprise to create insight-driven strategies and better customer experiences. “We run a wide variety of workloads on AWS Graviton-based instances for their significant price performance benefits,” said Jamal Mazhar, Vice President of Infrastructure and DevOps at Sprinklr. “After the announcement of AWS Graviton3, we benchmarked our workloads on the new Amazon EC2 C7g instances and observed 27% better performance compared to the previous generation instances. Based on these results, we are looking forward to adopting AWS Graviton3-based instances in production.”

NextRoll, Inc. is a marketing and data technology company with a mission to accelerate growth for companies, big and small. Powered by machine learning, NextRoll’s technology gathers data, delivers reliable insights, and provides businesses with approachable tools to target buyers in strategic ways – all on one platform. “We have found that AWS Graviton3-based C7g instances are ideal for bidders, ad servers, and ElastiCache clusters,” said Valentino Volonghi, CTO at NextRoll. “We are seeing about 15% more requests handled by C7g instances compared to AWS Graviton2-based C6g instances. With C7g instances, we also observed up to 40% better latency. Based on these findings, we are looking forward to adopting AWS Graviton3-based C7g instances in production.”

Ansys is a global leader in engineering simulation. “As engineers and designers face increasingly complex problems, cloud computing helps lower the barrier of access to high-performance computing, allowing users to solve problems faster,” said Prith Banerjee, Chief Technology Officer at Ansys. “Ansys has also been focusing on green computing initiatives with the goal of improving energy efficiency and reducing costs to customers. With the support of LS-DYNA on the AWS Graviton3 processor powered by AWS, Ansys customers will get the best of both worlds – access to a world-class multiphysics solver without compromising on speed, and lower energy use and costs.”

Beamr is a leading provider of image and video optimization solutions that enable professional photographers to improve their workflows, photo sharing services to improve user experience (UX) and reduce churn, and video service providers to reduce storage and delivery costs. “Beamr's JPEGmini software, written in C/C++, optimizes JPEG image files by reducing their file size without compromising quality. The application is compute-intensive and includes functions for image decoding, image encoding, and a quality measure algorithm that analyzes various image attributes,” said Dan Julius, Vice President of R&D at Beamr. “Since the mobile version of this software runs on Arm processors, we decided to test its performance on AWS Graviton3-based Amazon EC2 C7g instances. Rebuilding our software to run on C7g instances took us one working day, and the results were promising. When running on C7g instances, we saw 30% improved performance over comparable x86-based instances. Based on these results, we plan to recommend that our customers run the Beamr JPEGmini software on Graviton3-based instances once they become generally available, and we plan to benchmark Beamr’s H.264 and HEVC video encoders on Graviton instances as well.”

About Amazon Web Services

For over 15 years, Amazon Web Services has been the world’s most comprehensive and broadly adopted cloud offering.
AWS has been continually expanding its services to support virtually any cloud workload, and it now has more than 200 fully featured services for compute, storage, databases, networking, analytics, machine learning and artificial intelligence (AI), Internet of Things (IoT), mobile, security, hybrid, virtual and augmented reality (VR and AR), media, and application development, deployment, and management from 84 Availability Zones within 26 geographic regions, with announced plans for 24 more Availability Zones and eight more AWS Regions in Australia, Canada, India, Israel, New Zealand, Spain, Switzerland, and the United Arab Emirates. Millions of customers—including the fastest-growing startups, largest enterprises, and leading government agencies—trust AWS to power their infrastructure, become more agile, and lower costs.
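To make the launch concrete: the article notes that C7g instances follow EC2's standard On-Demand model, with no minimum commitments and pay-per-use billing. The sketch below shows how a customer might request one through the EC2 RunInstances API. It is a minimal illustration, not AWS documentation; the AMI ID is a placeholder, and the helper function name is mine. The actual API call (shown commented out) would require boto3 and configured credentials.

```python
# Sketch: requesting one of the new C7g instances through the EC2 API.
# The AMI ID below is a placeholder, not a real image.

def build_run_instances_params(instance_type="c7g.xlarge",
                               ami_id="ami-0123456789abcdef0",
                               count=1):
    """Build the keyword arguments for an EC2 RunInstances call.

    On-Demand is the default purchasing model, matching the article's
    "pay only for the amount of compute used" description; Spot,
    Reserved, or Savings Plans pricing would be arranged separately.
    """
    return {
        "ImageId": ami_id,              # must be an arm64 AMI for Graviton3
        "InstanceType": instance_type,  # compute-optimized Graviton3 family
        "MinCount": count,
        "MaxCount": count,
    }

params = build_run_instances_params()

# With boto3 installed and credentials configured, the call would be:
#   import boto3
#   ec2 = boto3.client("ec2", region_name="us-east-1")  # N. Virginia, one of
#   ec2.run_instances(**params)                         # the two launch Regions
print(params["InstanceType"])
```

Note that because Graviton3 is an Arm (arm64) processor, the image and any compiled software must be built for that architecture, which is what Beamr's one-day rebuild in the article refers to.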

Read More

VMWARE

Broadcom in Talks to Acquire Cloud Company VMware

VMware | May 23, 2022

Broadcom Inc. is in talks to acquire VMware Inc., the cloud-computing company backed by billionaire Michael Dell, according to people familiar with the matter, setting up a blockbuster tech deal that would vault the chipmaker into a highly specialized area of software. The discussions are ongoing and there’s no guarantee they will lead to a purchase, said the people, who asked not to be identified because the matter isn’t public. VMware currently has a market valuation of about $40 billion. Assuming a typical premium, the potential deal price would be higher, though the terms under consideration couldn’t be learned.

Shares in VMware rose 15% in premarket trading on Monday, which would give the company a market value of about $46 billion. Broadcom, which has a valuation of about $222 billion, fell 2.4%. The transaction would extend a run of acquisitions for Broadcom Chief Executive Officer Hock Tan, who has built one of the largest and most diversified companies in the chip industry. Software has been a key focus in recent years, with Broadcom buying CA Technologies in 2018 and Symantec Corp.’s enterprise security business in 2019. A representative for VMware declined to comment. A representative for Broadcom wasn’t available for comment.

“Investors have been increasingly focused on Broadcom’s appetite for another strategic or platform enterprise software acquisition—especially given the recent compression in software valuations,” Wells Fargo analysts wrote after Bloomberg News’s report. “An acquisition of VMware would be considered as making strategic sense, consistent with Broadcom’s focus on building out a deepening enterprise infrastructure software strategy.”

Broadcom makes a wide range of electronics, with its products going into everything from the iPhone to industrial equipment. But data centers have become a critical source of growth, and bulking up on software gives the company more ways to target that market.
Broadcom was previously in talks to acquire SAS Institute Inc., a closely held software company valued at $15 billion to $20 billion. But those discussions ended last year without a deal.

Read More

SERVER VIRTUALIZATION

Leading Global Logistics Company Completes Milestone in Data Platform Modernization with Datometry

Datometry | May 21, 2022

Datometry, the pioneer of database virtualization, announced today that a leading integrated container logistics company has completed a critical milestone of its data platform modernization with Datometry Hyper-Q. Using the Datometry Hyper-Q virtualization platform – the first to make existing applications fully interoperable with cloud databases without disrupting existing business processes – the company migrated from its on-premises system to Microsoft Azure Synapse in record time.

The logistics company, ranked within the Forbes Global 2000, sought to migrate from its legacy, on-premises data warehouse – known for being one of the most complex and sophisticated installations of its kind – to a modern cloud data warehouse (CDW) that is cost-effective, highly scalable, and supports the flexibility and speed demanded by its customers' supply chains. The company needed a solution that would enable it to maintain its existing high-volume ETL processes and simultaneously serve a large user community of business analysts and data scientists.

Datometry Hyper-Q uniquely addressed the customer's business objectives, enabling the logistics leader to transfer its existing applications natively to Azure Synapse without costly rewrites of SQL code, at a fraction of the time and risk associated with typical database migrations. The customer saved tens of millions of dollars by migrating with Datometry, and its new, fully managed CDW is far more cost-effective to operate than its legacy database. The customer was also able to preserve its long-standing investments in ETL, analytics, reporting, and BI entirely.

“As a global pioneer in ocean and inland shipping, our customer – arguably the worldwide leader in logistics – has long been at the forefront of organizations' ability to meet the needs of businesses, and those businesses' customers, worldwide. An enterprise the size and scope of this customer migrating to the cloud with Datometry and Microsoft so quickly, without the cost and risk of a typical migration, demonstrates how database virtualization can unlock the benefits of the cloud for any organization,” said Mike Waas, CEO, Datometry.

Datometry's customer considered a conventional database migration at first, but decided on Datometry for its digital transformation upon determining that a conventional migration would take at least five years, cost several tens of millions of dollars, and still present only a 20% chance of success. Datometry Hyper-Q is used by leading Fortune 500 and Global 2000 enterprises to accelerate cloud modernization and move workloads between data warehouses. The Datometry Hyper-Q virtualization platform eliminates risk-laden, expensive, and time-consuming application rewrites.

About Datometry

Datometry is the global leader in database system virtualization. Datometry empowers enterprises to run their existing applications directly on the cloud database of their choice without the business disruption of costly and risk-laden database migrations and application rewrites. Leading Fortune 500 and Global 2000 enterprises worldwide realize significant cost savings and consistently outpace their competition by using Datometry during this critical period of transformation to cloud-native data management.
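The article does not describe Hyper-Q's internals, which are proprietary. But the core idea of database virtualization, intercepting an application's queries in a legacy SQL dialect and rewriting them on the fly for the target cloud warehouse so the application itself never changes, can be illustrated with a deliberately tiny toy. The example below assumes a Teradata-style legacy dialect purely for illustration (the article does not name the customer's legacy system), and the function name is mine.

```python
import re

# Toy illustration only: Datometry Hyper-Q's actual translation engine is
# proprietary. This sketch shows the general idea of emulating a legacy
# data warehouse dialect (here, Teradata-style SQL, an assumption for the
# example) on a cloud warehouse such as Azure Synapse, which speaks T-SQL,
# so existing applications need no rewrites.

def translate_legacy_sql(query: str) -> str:
    """Rewrite a few Teradata-style constructs into T-SQL equivalents."""
    # Teradata allows 'SEL' as shorthand for 'SELECT'.
    query = re.sub(r"^\s*SEL\b", "SELECT", query, flags=re.IGNORECASE)
    # A trailing Teradata 'SAMPLE n' roughly maps to T-SQL's 'TOP n'.
    m = re.search(r"\bSAMPLE\s+(\d+)\s*$", query, flags=re.IGNORECASE)
    if m:
        query = re.sub(r"\bSAMPLE\s+\d+\s*$", "", query,
                       flags=re.IGNORECASE).rstrip()
        query = re.sub(r"^SELECT\b", f"SELECT TOP {m.group(1)}", query,
                       flags=re.IGNORECASE)
    return query

print(translate_legacy_sql("SEL order_id, status FROM shipments SAMPLE 10"))
```

A production virtualization layer would of course parse the SQL into a full syntax tree and also emulate session semantics, stored procedures, and data types rather than pattern-match text, which is part of why the article contrasts this approach with multi-year manual rewrites.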

Read More

Events