BRINGING VIRTUAL REALITY TO THE WEB PLATFORM

The world of Virtual Reality in 2016 feels a lot like the world of Mobile in 2007: a lot of expectations around a new way to interact with users, a lot of innovation in hardware and software to make it a reality, and a lot of unknowns as to how these interactions will develop. Having played a tiny role in making the Web a better platform for mobile devices, and with the feeling that Progressive Web Apps are finally bringing us where we need to be in that space, I have spent the past few months looking at what the Web needs in order to become at least one of the major platforms for Virtual Reality experiences.

Spotlight

Searchlight Cyber

We provide organizations with relevant and actionable dark web threat intelligence to help them identify and prevent criminal activity. Founded in 2017 with a mission to stop criminals acting with impunity on the dark web, we have been involved in some of the world’s largest dark web investigations and have the most comprehensive dataset based on proprietary techniques and ground-breaking academic research.

OTHER ARTICLES
Server Hypervisors

Evaluating the Impact of Application Virtualization

Article | September 9, 2022

The emergence of virtualization in today's digital world has turned the tables. It has helped the sector increase productivity and make every activity easy and effective. One of the most remarkable innovations is application virtualization, which allows users to access and utilize applications even if they are not installed on the system on which they are working. As a result, the cost of obtaining software and installing it on specific devices is reduced.

Application virtualization is a technique that separates an application from the operating system on which it runs. It provides access to a program without requiring it to be installed on the target device. The program functions and interacts with the user as if it were native to the device: the program window can be resized, moved, or minimized, and the user can use normal keyboard and mouse actions. There may be minor differences from time to time, but the user gets a seamless experience.

Let’s have a look at the ways in which application virtualization helps businesses.

The Impact of Application Virtualization

• Remote-Safe Approach
Application virtualization enables remote access to essential programs from any end device in a safe and secure manner. With remote work developing into an increasingly successful global work paradigm, the majority of businesses have adapted to work-from-home practices. This technology is a strong option for remote working environments because it combines security with convenience of access.

• Expenditure Limitations
If you have a large and ever-growing end-user base, acquiring and operating separate, expensive devices for each individual user would exhaust your budget. In such situations, virtualization comes in handy because it can deliver all necessary applications to any target device.
• Rolling Out Cloud Applications
Application virtualization can aid in the development and execution of a controlled strategy for a seamless cloud transition of an application that is presently used on-premise in parts of the same enterprise. In such cases, it is vital to guarantee that the application continues to work properly while being rolled out to cloud locations. You can ensure maximum continuity and minimal impact on your end customers by adopting a modern virtualization platform. These platforms help ensure that both the on-premise and cloud versions of the application are delivered smoothly to diverse groups sitting inside the same workspace.

• Implementation of In-House Applications
Another prominent case in which virtualization can be beneficial is the deployment and execution of in-house applications. Developers often update such programs on a regular basis. Application virtualization enables extensive remote updates, installation, and distribution of critical software. As a result, this technology is crucial for enterprises that build and employ in-house applications.

Closing Lines

There is no doubt about the efficiency and advantages of application virtualization. You do not need to be concerned with installing the programs on your system, and you do not need to meet the minimum requirements for running them, since they operate on the hosted server while giving you the impression that the application is running locally. There will be no performance concerns when the program runs, no overload on your system, and no compatibility issues caused by your system's underlying operating system.

Read More
VMware, vSphere, Hyper-V

Virtualizing Broadband Networks: Q&A with Tom Cloonan and David Grubb

Article | May 2, 2023

The future of broadband networks is fast, pervasive, reliable, and increasingly, virtual. Dell’Oro predicts that virtual CMTS/CCAP revenue will grow from $90 million in 2019 to $418 million worldwide in 2024. While network virtualization is still in its earliest stages of deployment, many operators have begun building their strategy for virtualizing one or more components of their broadband networks.

Read More
Virtual Desktop Tools

VM Applications for Software Development and Secure Testing

Article | August 12, 2022

Contents

1. Introduction
2. Software Development and Secure Testing
3. Using VMs in Software Development and Secure Testing
4. Conclusion

1. Introduction

“Testing is an infinite process of comparing the invisible to the ambiguous in order to avoid the unthinkable happening to the anonymous.” —James Bach

Testing software is crucial for identifying and fixing security vulnerabilities. However, meeting quality standards for functionality and performance does not guarantee security. Thus, software testing nowadays is a must to identify and address application security vulnerabilities to maintain the following:

• Security of data history, databases, information, and servers
• Customers’ integrity and trust
• Web application protection from future attacks

VMs provide a flexible and isolated environment for software development and security testing. They offer easy replication of complex configurations and testing scenarios, allowing efficient issue resolution. VMs also provide secure testing by isolating applications from the host system and enabling a reset to a previous state. In addition, they facilitate DevOps practices and streamline the development workflow.

2. Software Development and Secure Testing

Software Secure Testing: The Approach

The following approaches must be considered while preparing and planning for security tests:

• Architecture Study and Analysis: Understand whether the software meets the necessary requirements.
• Threat Classification: List all potential threats and risk factors that must be tested.
• Test Planning: Run the tests based on the identified threats, vulnerabilities, and security risks.
• Testing Tool Identification: Identify the security testing tools relevant to the software's specific use cases.
• Test-Case Execution: After performing a security test, fix the findings using suitable open-source code or manually.
• Reports: Prepare a detailed test report of the security tests performed, containing a list of the vulnerabilities, threats, and issues resolved, as well as those still pending.

Ensuring the security of an application that handles essential functions is paramount. This may involve safeguarding databases against malicious attacks or implementing fraud detection mechanisms for incoming leads before integrating them into the platform. Maintaining security is crucial throughout the software development life cycle (SDLC) and must be at the forefront of developers' minds while executing the software's requirements. With consistent effort, the SDLC pipeline addresses security issues before deployment, reducing the risk of discovering application vulnerabilities while minimizing the damage they could cause.

A secure SDLC makes developers responsible for critical security. Developers need to be aware of potential security concerns at each step of the process. This requires integrating security into the SDLC in ways that were not needed before. As anyone can potentially access source code, coding with potential vulnerabilities in mind is essential. As such, having a robust and secure SDLC process is critical to ensuring applications are not subject to attacks by hackers.

3. Using VMs in Software Development and Secure Testing

Snapshotting: Snapshotting allows developers to capture a VM's state at a specific point in time and restore it later. This feature is helpful for debugging and enables developers to roll back to a previous state when an error occurs. A virtual machine provides several operations for creating and managing snapshots and snapshot chains. These operations let users create snapshots, revert to any snapshot in the chain, and remove snapshots. In addition, extensive snapshot trees can be created to streamline the flow.
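As an illustration, the create/revert/remove operations described above can be sketched as a toy model in plain Python. This is deliberately not tied to any particular hypervisor API; the class and field names are invented for the example:

```python
import copy

class SnapshotChain:
    """Toy model of a VM snapshot chain: create, revert, remove."""

    def __init__(self, state):
        self.state = state      # current "VM state" (any copyable object)
        self.snapshots = {}     # snapshot name -> saved copy of the state

    def create(self, name):
        # Capture the VM's state at this point in time.
        self.snapshots[name] = copy.deepcopy(self.state)

    def revert(self, name):
        # Roll back to any snapshot in the chain.
        self.state = copy.deepcopy(self.snapshots[name])

    def remove(self, name):
        del self.snapshots[name]

vm = SnapshotChain({"installed": ["os"]})
vm.create("clean")                        # baseline snapshot before testing
vm.state["installed"].append("beta-app")  # risky change made during a test
vm.revert("clean")                        # error found: roll back
print(vm.state["installed"])              # → ['os']
```

Real hypervisors add disk and memory state, snapshot trees, and chain consolidation on top of this same basic create/revert/remove cycle.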
Virtual Networking: Virtual networking allows virtual machines to be connected to virtual networks that simulate complex network topologies, letting developers test their applications in different network environments. It also allows data centers to span multiple physical locations, giving access to a wider range of efficient options. Teams can effortlessly modify the network as requirements change, without any additional hardware, and provisioning networks for specific applications and needs offers greater flexibility. Additionally, workloads can be moved seamlessly across the network infrastructure without compromising service, security, or availability.

Resource Allocation: VMs can be configured with specific resource allocations such as CPU, RAM, and storage, allowing developers to test their applications under different resource constraints. Maintaining a 1:1 ratio between virtual machine processors and host cores is highly recommended: over-subscribing virtual machine processors to a single core can lead to stalled or delayed events, causing significant frustration among users. In practice, however, IT administrators sometimes overallocate virtual machine processors. In such cases, a practical approach is to start with a 2:1 ratio and gradually move towards 4:1, 8:1, 12:1, and so on while bringing virtual allocation into the IT infrastructure. This approach ensures a safe and gradual transition towards optimized virtual resource allocation.

Containerization within VMs: Containerization within VMs provides an additional layer of isolation and security for applications. Enterprises are finding new use cases for VMs that leverage their in-house and cloud infrastructure to support heavy-duty application and networking workloads, which can also have a positive impact on the environment.
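The vCPU oversubscription ratios mentioned above are simple arithmetic. The sketch below is illustrative only (the host size is an assumed example, and the ratios come from the text, not from any vendor tool):

```python
def max_vcpus(physical_cores: int, ratio: float) -> int:
    """vCPUs that can be allocated at a given oversubscription ratio.

    ratio=1.0 is the recommended 1:1 mapping; 2.0 means 2:1, and so on.
    """
    return int(physical_cores * ratio)

host_cores = 16  # assumed example host
for ratio in (1, 2, 4, 8):
    # prints lines from "1:1 -> 16 vCPUs" up to "8:1 -> 128 vCPUs"
    print(f"{ratio}:1 ->", max_vcpus(host_cores, ratio), "vCPUs")
```

The higher the ratio, the more VMs fit on the host, but the greater the chance that vCPUs contend for the same physical core and stall.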
DevOps teams use containerization with virtualization to improve software development flexibility. Containers bundle an application with the components it needs, such as code, system tools, and libraries. For complex applications, virtual machines and containers are used together: containers often host the front end and middleware, while VMs host the back end.

VM Templates: VM templates are pre-configured virtual machines that serve as a base for creating new virtual machines, making it easier to set up development and testing environments. A VM template is an image of a virtual machine that serves as a master copy; it includes VM disks, virtual devices, and settings. A template lets you clone a virtual machine multiple times, and when you clone a VM from a template, the clones are independent and not linked to the template. VM templates are handy when a large number of similar VMs needs to be deployed, and they preserve VM consistency. To edit a template, convert it to a VM, make the necessary changes, and then convert the edited VM back into a new template.

Remote Access: VMs can be accessed remotely, allowing developers and testers to collaborate effectively from anywhere in the world. To manage a virtual machine, enable remote access, connect to the virtual machine, and then access the VNC or serial console. Once connected, full permission to manage the virtual machine is granted with the user's approval. Remote access is secure, as connections can be encrypted and authenticated to prevent unauthorized access, and it simplifies administration, as virtual machines can be monitored and controlled from a central location.

DevOps Integration: DevOps is a collection of practices, principles, and tools that allow a team to release software quickly and efficiently.
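The template workflow above, a master copy producing independent clones, can also be sketched as a toy model. The dictionary fields here merely stand in for disks, devices, and settings; they are not any vendor's schema:

```python
import copy

def clone_from_template(template: dict, name: str) -> dict:
    """Create an independent VM from a template (a full copy, not a link)."""
    vm = copy.deepcopy(template)
    vm["name"] = name
    return vm

template = {"name": "base", "disks": ["os.vmdk"], "ram_mb": 2048}
dev = clone_from_template(template, "dev-01")
dev["disks"].append("scratch.vmdk")  # changing a clone...

print(template["disks"])  # → ['os.vmdk']  (...leaves the master copy untouched)
```

Because the clone is a deep copy, edits to it never propagate back to the template, mirroring the "convert, edit, re-convert" cycle described above for updating a template safely.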
Virtualization is vital in DevOps when developing intricate cloud, API, and SOA systems. Virtual machines enable teams to simulate environments for creating, testing, and launching code, ultimately conserving computing resources. When commencing a bug search at the API layer, teams find that virtual machines are well suited to test-driven development (TDD). Virtualization providers handle updates, freeing DevOps teams to focus on other areas and increasing productivity by 50–60%. In addition, VMs allow simultaneous testing of multiple release and patch levels, improving product compatibility and interoperability.

4. Conclusion

The outlook for virtual machine applications is highly promising in the development and testing fields. With the increasing complexity of development and testing processes, VMs can significantly simplify and streamline these operations. In the future, VMs are expected to become even more versatile and potent, providing developers and testers with a broader range of tools and capabilities to facilitate the development process. One potential future development is integrating machine learning and artificial intelligence into VMs. This would enable VMs to automate various tasks, optimize the allocation of resources, and generate recommendations based on performance data. Moreover, VMs may become more agile and lightweight, allowing developers and testers to spin instances up and down with greater efficiency. The future of VM applications for software development and security testing looks bright, with continued innovation expected to provide developers and testers with even more powerful and flexible tools to improve the software development process.

Read More
Server Virtualization

How to Start Small and Grow Big with Data Virtualization

Article | June 9, 2022

Why Should Companies Care about Data Virtualization?

Data is everywhere. With each passing day, companies generate more data than ever before, and what exactly can they do with all this data? Is it just a matter of storing it? Or should they manage and integrate their data from various sources? How can they store, manage, integrate, and utilize their data to gain information that is of critical value to their business?

As they say, knowledge is power, but knowledge without action is useless. This is where the Denodo Platform comes in. The Denodo Platform gives companies the flexibility to evolve their data strategies, migrate to the cloud, or logically unify their data warehouses and data lakes, without affecting business. This powerful platform offers a variety of subscription options that can benefit companies immensely. For example, companies often start out with individual projects using a Denodo Professional subscription, but in a short period of time they end up adding more and more data sources and move on to other Denodo subscriptions, such as Denodo Enterprise or Denodo Enterprise Plus. The upgrade process is very easy to establish; in fact, it can be done in less than a day once the cloud marketplace is chosen (Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)). In as little as six weeks, companies can realize real business benefits from managing and utilizing their data effectively.

A Bridging Layer

Data virtualization has been around for quite some time now. Denodo’s founders, Angel Viña and Alberto Pan, have been involved in data virtualization since as far back as the 1990s. If you’re not familiar with data virtualization, here is a quick summary: data virtualization is the cornerstone of a logical data architecture, whether it be a logical data warehouse, logical data fabric, data mesh, or even a data hub.
All of these architectures are best served by our principles: Combine (bring together all your data sources), Connect (into a logical single view), and Consume (through standard connectors to your favorite BI/data science tools or through our easy-to-use, robust APIs). Data virtualization is the bridge that joins multiple data sources to fuel analytics. It is also the logical data layer that effectively integrates data silos across disparate systems, manages unified data for centralized security, and delivers it to business users in real time.

Economic Benefits in Less Than Six Weeks with Data Virtualization?

In a short duration, how can companies benefit from choosing data virtualization as a data management solution? To answer this question, below are some very interesting KPIs discussed in the recently released Forrester study on the Total Economic Impact of Data Virtualization. For example, companies that have implemented data virtualization have seen an 83% increase in business user productivity, mainly due to the business-centric way a data virtualization platform is delivered: business users get an easy-to-access, democratized interface to the data they need. The second KPI to note is a 67% reduction in development resources. With data virtualization, you connect to the data; you do not copy it. Once it is set up, there is a significant reduction in the need for data integration engineers, as data remains in the source location and is not copied around the enterprise. Finally, companies report a 65% improvement in data access speeds above and beyond more traditional approaches such as extract, transform, and load (ETL) processes.
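As an analogy only (SQLite standing in for federated sources; this is not the Denodo Platform's API), the "connect, don't copy" idea can be sketched as one query that spans two attached databases in place, rather than an ETL job that copies both into a central store first:

```python
import sqlite3

# One connection acts as the "logical layer"; ATTACH points it at two
# independent source databases (in-memory stand-ins for remote systems).
con = sqlite3.connect(":memory:")
con.execute("ATTACH ':memory:' AS crm")
con.execute("ATTACH ':memory:' AS erp")

con.execute("CREATE TABLE crm.customers (id INTEGER, name TEXT)")
con.execute("CREATE TABLE erp.orders (customer_id INTEGER, total REAL)")
con.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
con.executemany("INSERT INTO erp.orders VALUES (?, ?)",
                [(1, 100.0), (1, 50.0), (2, 75.0)])

# Combine + Connect + Consume: a single logical query joins both sources
# where they live; no data is duplicated into the consuming layer.
rows = con.execute("""
    SELECT c.name, SUM(o.total)
    FROM crm.customers AS c JOIN erp.orders AS o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # → [('Acme', 150.0), ('Globex', 75.0)]
```

A data virtualization platform generalizes this pattern across heterogeneous systems (warehouses, lakes, APIs), adding caching, security, and query optimization on top.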
A Modern Solution for an Age-Old Problem

To understand how data virtualization can help elevate projects to an enterprise level, we can share a few use cases in which companies have leveraged data virtualization to solve their business problems across several different industries. For example, in finance and banking we often see use cases in which data virtualization serves as a unifying platform to help improve compliance and reporting. In retail, we see use cases including predictive analytics in supply chains, as well as next-best actions derived from a unified view of the customer. There are many uses for data virtualization in a wider variety of situations, such as in healthcare and government agencies, where organizations use the Denodo Platform to help data scientists understand key trends and activities, both sociologically and economically. In a nutshell, if data exists in more than one source, then the Denodo Platform acts as the unifying platform that connects, combines, and allows users to consume the data in a timely, cost-effective manner.

Read More

Related News

Virtualized Environments

VeriSilicon Unveils the New VC9800 IP for Next Generation Data Centers

Business Wire | January 09, 2024

VeriSilicon today unveiled its latest VC9800 series Video Processor Unit (VPU) IP with enhanced video processing performance to strengthen its presence in data center applications. The newly launched series caters to the advanced requirements of next-generation data centers, including video transcoding servers, AI servers, virtual cloud desktops, and cloud gaming. The VC9800 series of VPU IP boasts high performance, high throughput, and server-level multi-stream encoding and decoding capabilities. It can handle up to 256 streams and supports all mainstream video formats, including the new advanced format VVC. Through Rapid Look Ahead encoding, the VC9800 series improves video quality significantly with a low memory footprint and low encoding latency. Capable of 8K encoding and decoding, it offers enhanced video post-processing and multi-channel encoding at various resolutions, thus achieving an efficient transcoding solution. The VC9800 series of VPU IP can seamlessly interface with Neural Network Processor (NPU) IP, enabling a complete AI-video pipeline. When combined with VeriSilicon’s Graphics Processor Unit (GPU) IP, the subsystem solution is able to deliver enhanced gaming experiences. In addition, the hardware virtualization, super resolution image enhancement, and AI-enabled encoding functions of this series also offer effective solutions for virtual cloud desktops. “VeriSilicon’s advanced video transcoding technology continues to lead in the data center domain. We are working closely with global leading customers to develop comprehensive video processing subsystem solutions to meet the requirements of the latest data centers,” said Wei-Jin Dai, Executive VP and GM of the IP Division of VeriSilicon. “For AI computing, our video post-processing capabilities have been extended to smoothly interact with NPUs, ensuring OpenCV-level accuracy.
We’ve also introduced super resolution technology to the video processing subsystem, elevating image quality and ultimately enhancing user experiences for cloud computing and smart display.”

About VeriSilicon

VeriSilicon is committed to providing customers with platform-based, all-around, one-stop custom silicon services and semiconductor IP licensing services leveraging its in-house semiconductor IP.

Read More

Server Virtualization

Panasonic Automotive Introduces Neuron High-Performance Compute (HPC) to Advance to a Software-Defined Mobility Future

PR Newswire | January 09, 2024

Panasonic Automotive Systems Company of America, a tier-one automotive supplier and a division of Panasonic Corporation of North America, announced its High-Performance Compute (HPC) system. Named Neuron, this innovation addresses the rapidly evolving mobility needs anticipated for software-defined vehicle advancements. As vehicles become more software reliant, vehicle systems must support the extended software lifecycle by enabling software upgrades and prolonging the supporting hardware capability. Cars rely on hardware and software compute platforms to process, share, sense, and derive insights to handle functions for assisted driving. Panasonic Automotive's Neuron HPC allows for not only software updates and upgrades but also hardware upgrades across platform lifecycles. The Neuron HPC can aggregate multiple computing zones to reduce the cost, weight and integration complexity of the vehicle by removing redundant components. Panasonic Automotive's design supports effortless up-integration with high-performance and heavy data input processing capability. Importantly, the design is upgradeable, scalable and future-proof across today's evolving in-vehicle platforms. Neuron HPC Architecture & Design Panasonic Automotive's High Performance Compute architecture could reduce the number of distributed electronic control units (ECUs) by up to 80%1 – allowing for faster, lighter, cross-domain computing for real-time, cross-functional communications. The Neuron HPC design is suited for any mobility platform including internal combustion engine, hybrid, fuel cell or electric vehicles. "In collaboration with OEMs, Panasonic Automotive has designed and met some of the largest central compute platform challenges in the industry in order to make the driving experience evolve with technology," said Andrew Poliak, CTO, Panasonic Automotive Systems Company of America. 
"Neuron maximizes performance, safety and innovation over the entire ownership of the consumer's vehicle and enables OEMs with a future-proof SDV platform for ensuing generations of mobility needs."

Key Systems, UX Features & Technical Benefits

With a streamlined design, the Neuron HPC incorporates up-integration capability by consolidating multiple ECUs into one centralized nucleus to handle all levels of ADAS, chassis, body, and in-cabin infotainment features.

About Panasonic Automotive Systems Company of America

Panasonic Automotive Systems Company of America is a division company of Panasonic Corporation of North America and a leading global supplier of automotive infotainment and connectivity system solutions. It acts as the North American affiliate of Panasonic Automotive Systems Co., Ltd., which coordinates global automotive operations. Panasonic Automotive Systems Company of America is headquartered in Peachtree City, Georgia, with sales, marketing and engineering operations in Farmington Hills, Mich.

About Panasonic Corporation of North America

Newark, NJ-based Panasonic Corporation of North America is committed to creating a better life and a better world by enabling its customers through innovations in Sustainable Energy, Immersive Entertainment, Integrated Supply Chains and Mobility Solutions. The company is the principal North American subsidiary of Osaka, Japan-based Panasonic Corporation. One of Interbrand's Top 100 Best Global Brands of 2023, Panasonic is a leading technology partner and integrator to businesses, government agencies and consumers across the region.

Read More

Server Virtualization

AELF Partners with ChainsAtlas to Pioneer Interoperability in Blockchain

PR Newswire | January 09, 2024

aelf is advancing cross-chain interoperability through a strategic partnership with ChainsAtlas. By utilising ChainsAtlas' innovative virtualisation technology, aelf will enable decentralised applications (dApps) from diverse blockchains to seamlessly migrate and integrate into the aelf blockchain, regardless of the dApps' smart contract specifications. This collaboration marks a significant step towards a globally interconnected and efficient blockchain ecosystem, breaking down the silos between blockchains. Khaniff Lau, Business Development Director at aelf, shares, "The strategic partnership with ChainsAtlas is a significant step towards realising our vision of a seamlessly interconnected blockchain world. With this integration, aelf is set to become a hub for cross-chain activities, enhancing our ability to support a wide array of dApps, digital assets, and Web2 apps. This collaboration is not just about technology integration; it's about shaping the future of how services and products on blockchains interact and operate in synergy." Jan Hanken, Co-founder of ChainsAtlas, says, "ChainsAtlas was always built to achieve two major goals: to make blockchain development accessible to a broad spectrum of developers and entrepreneurs and, along that path, to pave the way for a truly omnichain future." "By joining forces with aelf, we are bringing that visionary future much closer to reality. As we anticipate the influx of creativity from innovators taking their first steps into the world of Web3 on aelf, driven by ChainsAtlas technology, we are excited to see these groundbreaking ideas come to life," adds Hanken. The foundation for true cross-chain interoperability is being built as aelf integrates ChainsAtlas' Virtualization Unit (VU), enabling the aelf blockchain to accommodate both EVM and non-EVM digital assets. 
This cross-chain functionality is accomplished through ChainsAtlas' virtualisation technology, allowing aelf to interpret and execute smart contracts written in other languages supported by ChainsAtlas, while also establishing state transfer mechanisms that facilitate seamless data and asset flow between aelf and other blockchains. Through this partnership, aelf blockchain's capabilities will be enhanced as it becomes able to support a more comprehensive range of dApps and games, and developers from diverse coding backgrounds will be empowered to build on the aelf blockchain. The partnership will also foster increased engagement within the Web3 community, as users gain access to a more diverse range of digital assets on aelf. Looking ahead, the partnership between aelf and ChainsAtlas will play a pivotal role in advancing the evolution of aelf's sidechains by enabling simultaneous execution of program components across multiple VUs on different blockchains.

About aelf

aelf is a high-performance Layer 1 blockchain featuring multi-sidechain technology for unlimited scalability, designed to power the development of Web3 and support its continuous advancement into the future. Founded in 2017 with its global hub based in Singapore, aelf is one of the pioneers of the mainchain-sidechain architecture concept. Incorporating key foundational components, including AEDPoS, aelf's variation of a Delegated Proof-of-Stake (DPoS) consensus protocol; parallel processing; peer-to-peer (P2P) network communication; cross-chain bridges; and a dynamic sidechain indexing mechanism, aelf delivers a highly efficient, safe, and modular ecosystem with high throughput, scalability, and interoperability. aelf facilitates the building, integrating, and deploying of smart contracts and decentralised apps (dApps) on its blockchain with its native C# software development kit (SDK) and SDKs in other languages, including Java, JS, Python, and Go.
aelf's ecosystem also houses a range of dApps to support a flourishing blockchain network. aelf is committed to fostering innovation within its ecosystem and remains dedicated to driving the development of Web3 and the adoption of blockchain technology.

About ChainsAtlas

ChainsAtlas introduces a new approach to Web3 infrastructure, blending multiple blockchain technologies and smart contract features to create a unified, efficient processing network. Its core innovation lies in virtualization-enabled smart contracts, allowing consistent software operation across different blockchains. This approach enhances decentralized applications' complexity and reliability, promoting easier integration of existing software into the blockchain ecosystem. The team behind ChainsAtlas, driven by the transformative potential of blockchain, aims to foster global opportunities and equality. Their commitment to building on existing blockchain infrastructure marks a significant step towards a new phase in Web3, where advanced and reliable decentralized applications become the norm, setting new standards for the future of decentralized networks.

Read More

Virtualized Environments

VeriSilicon Unveils the New VC9800 IP for Next Generation Data Centers

Business Wire | January 09, 2024

VeriSilicon today unveiled its latest VC9800 series Video Processor Unit (VPU) IP with enhanced video processing performance to strengthen its presence in data center applications. The newly launched IP series caters to the advanced requirements of next-generation data centers, including video transcoding servers, AI servers, virtual cloud desktops, and cloud gaming.

The VC9800 series of VPU IP boasts high performance, high throughput, and server-level multi-stream encoding and decoding capabilities. It can handle up to 256 streams and supports all mainstream video formats, including the new advanced format VVC. Through Rapid Look Ahead encoding, the VC9800 series IP improves video quality significantly with a low memory footprint and low encoding latency. Capable of 8K encoding and decoding, it offers enhanced video post-processing and multi-channel encoding at various resolutions, achieving an efficient transcoding solution.

The VC9800 series of VPU IP can seamlessly interface with Neural Network Processor (NPU) IP, enabling a complete AI-video pipeline. When combined with VeriSilicon’s Graphics Processor Unit (GPU) IP, the subsystem solution is able to deliver enhanced gaming experiences. In addition, the hardware virtualization, super resolution image enhancement, and AI-enabled encoding functions of this IP series also offer effective solutions for virtual cloud desktops.

“VeriSilicon’s advanced video transcoding technology continues to lead in the data center domain. We are working closely with global leading customers to develop comprehensive video processing subsystem solutions to meet the requirements of the latest data centers,” said Wei-Jin Dai, Executive VP and GM of IP Division of VeriSilicon. “For AI computing, our video post-processing capabilities have been extended to smoothly interact with NPUs, ensuring OpenCV-level accuracy. We’ve also introduced super resolution technology to the video processing subsystem, elevating image quality and ultimately enhancing user experiences for cloud computing and smart display.”

About VeriSilicon

VeriSilicon is committed to providing customers with platform-based, all-around, one-stop custom silicon services and semiconductor IP licensing services leveraging its in-house semiconductor IP.

Read More

Server Virtualization

Panasonic Automotive Introduces Neuron High-Performance Compute (HPC) to Advance to a Software-Defined Mobility Future

PR Newswire | January 09, 2024

Panasonic Automotive Systems Company of America, a tier-one automotive supplier and a division of Panasonic Corporation of North America, announced its High-Performance Compute (HPC) system. Named Neuron, this innovation addresses the rapidly evolving mobility needs anticipated for software-defined vehicle advancements.

As vehicles become more software-reliant, vehicle systems must support an extended software lifecycle by enabling software upgrades and prolonging the capability of the supporting hardware. Cars rely on hardware and software compute platforms to process, share, and sense data and to derive insights that handle assisted-driving functions. Panasonic Automotive's Neuron HPC allows for not only software updates and upgrades but also hardware upgrades across platform lifecycles. The Neuron HPC can aggregate multiple computing zones to reduce the cost, weight and integration complexity of the vehicle by removing redundant components. Panasonic Automotive's design supports effortless up-integration, with high-performance processing capability for heavy data inputs. Importantly, the design is upgradeable, scalable and future-proof across today's evolving in-vehicle platforms.

Neuron HPC Architecture & Design

Panasonic Automotive's High-Performance Compute architecture could reduce the number of distributed electronic control units (ECUs) by up to 80%, allowing for faster, lighter, cross-domain computing for real-time, cross-functional communications. The Neuron HPC design is suited for any mobility platform, including internal combustion engine, hybrid, fuel cell or electric vehicles.

"In collaboration with OEMs, Panasonic Automotive has designed and met some of the largest central compute platform challenges in the industry in order to make the driving experience evolve with technology," said Andrew Poliak, CTO, Panasonic Automotive Systems Company of America. "Neuron maximizes performance, safety and innovation over the entire ownership of the consumer's vehicle and enables OEMs with a future-proof SDV platform for ensuing generations of mobility needs."

Key Systems, UX Features & Technical Benefits

With a streamlined design, the Neuron HPC incorporates up-integration capability by consolidating multiple ECUs into one centralized nucleus to handle all levels of ADAS, chassis, body, and in-cabin infotainment features.

About Panasonic Automotive Systems Company of America

Panasonic Automotive Systems Company of America is a division of Panasonic Corporation of North America and a leading global supplier of automotive infotainment and connectivity system solutions. It acts as the North American affiliate of Panasonic Automotive Systems Co., Ltd., which coordinates Panasonic's global automotive business. Panasonic Automotive Systems Company of America is headquartered in Peachtree City, Georgia, with sales, marketing and engineering operations in Farmington Hills, Mich.

About Panasonic Corporation of North America

Newark, NJ-based Panasonic Corporation of North America is committed to creating a better life and a better world by enabling its customers through innovations in Sustainable Energy, Immersive Entertainment, Integrated Supply Chains and Mobility Solutions. The company is the principal North American subsidiary of Osaka, Japan-based Panasonic Corporation. One of Interbrand's Top 100 Best Global Brands of 2023, Panasonic is a leading technology partner and integrator to businesses, government agencies and consumers across the region.

Read More

Server Virtualization

AELF Partners with ChainsAtlas to Pioneer Interoperability in Blockchain

PR Newswire | January 09, 2024

aelf is advancing cross-chain interoperability through a strategic partnership with ChainsAtlas. By utilising ChainsAtlas' innovative virtualisation technology, aelf will enable decentralised applications (dApps) from diverse blockchains to seamlessly migrate to and integrate with the aelf blockchain, regardless of the dApps' smart contract specifications. This collaboration marks a significant step towards a globally interconnected and efficient blockchain ecosystem, breaking down the silos between blockchains.

Khaniff Lau, Business Development Director at aelf, shares, "The strategic partnership with ChainsAtlas is a significant step towards realising our vision of a seamlessly interconnected blockchain world. With this integration, aelf is set to become a hub for cross-chain activities, enhancing our ability to support a wide array of dApps, digital assets, and Web2 apps. This collaboration is not just about technology integration; it's about shaping the future of how services and products on blockchains interact and operate in synergy."

Jan Hanken, Co-founder of ChainsAtlas, says, "ChainsAtlas was always built to achieve two major goals: to make blockchain development accessible to a broad spectrum of developers and entrepreneurs and, along that path, to pave the way for a truly omnichain future." "By joining forces with aelf, we are bringing that visionary future much closer to reality. As we anticipate the influx of creativity from innovators taking their first steps into the world of Web3 on aelf, driven by ChainsAtlas technology, we are excited to see these groundbreaking ideas come to life," adds Hanken.

The foundation for true cross-chain interoperability is being built as aelf integrates ChainsAtlas' Virtualization Unit (VU), enabling the aelf blockchain to accommodate both EVM and non-EVM digital assets. This cross-chain functionality is accomplished through ChainsAtlas' virtualisation technology, which allows aelf to interpret and execute smart contracts written in other languages supported by ChainsAtlas, while also establishing state transfer mechanisms that facilitate seamless data and asset flow between aelf and other blockchains.

Through this partnership, the aelf blockchain's capabilities will be enhanced as it supports a more comprehensive range of dApps and games, and developers from diverse coding backgrounds will be empowered to build on the aelf blockchain. The partnership will also foster increased engagement within the Web3 community, as users gain access to a more diverse range of digital assets on aelf. Looking ahead, the partnership between aelf and ChainsAtlas will play a pivotal role in advancing the evolution of aelf's sidechains by enabling simultaneous execution of program components across multiple VUs on different blockchains.

About aelf

aelf is a high-performance Layer 1 blockchain featuring multi-sidechain technology for unlimited scalability. The aelf blockchain is designed to power the development of Web3 and support its continuous advancement into the future. Founded in 2017 with its global hub based in Singapore, aelf is one of the pioneers of the mainchain-sidechain architecture concept. Incorporating key foundational components, including AEDPoS (aelf's variation of a Delegated Proof-of-Stake (DPoS) consensus protocol), parallel processing, peer-to-peer (P2P) network communication, cross-chain bridges, and a dynamic sidechain indexing mechanism, aelf delivers a highly efficient, safe, and modular ecosystem with high throughput, scalability, and interoperability. aelf facilitates the building, integration, and deployment of smart contracts and decentralised apps (dApps) on its blockchain with its native C# software development kit (SDK) and SDKs in other languages, including Java, JS, Python, and Go.

aelf's ecosystem also houses a range of dApps to support a flourishing blockchain network. aelf is committed to fostering innovation within its ecosystem and remains dedicated to driving the development of Web3 and the adoption of blockchain technology.

About ChainsAtlas

ChainsAtlas introduces a new approach to Web3 infrastructure, blending multiple blockchain technologies and smart contract features to create a unified, efficient processing network. Its core innovation lies in virtualization-enabled smart contracts, allowing consistent software operation across different blockchains. This approach enhances the complexity and reliability of decentralized applications, promoting easier integration of existing software into the blockchain ecosystem. The team behind ChainsAtlas, driven by the transformative potential of blockchain, aims to foster global opportunities and equality. Their commitment to building on existing blockchain infrastructure marks a significant step towards a new phase in Web3, where advanced and reliable decentralized applications become the norm, setting new standards for the future of decentralized networks.

Read More

Events