What's Behind the Numbers for VMware Inc. (VMW)?

Glenrock Gazette | August 20, 2019

VMware Inc. (VMW)'s MACD Histogram reading is currently below the zero line, indicating a negative chart trend for the shares. Shares recently traded at 145.14 on the bid, moving 1.42 in the most recent session.

Created by Thomas Aspray in 1986, the MACD Histogram is a visual indicator of the difference between the MACD line and the Signal line, which by default is a 9-period EMA of the MACD line. The histogram is an oscillator that moves above and below the zero line, just as the MACD line does. Keep in mind when using this oscillator that it takes four mathematical steps to get from price to the histogram: price => two EMAs => MACD line => Signal line => Histogram. That means the histogram lags price considerably, but like all derivatives of price, it is much smoother than price itself. A MACD reading above zero helps confirm an uptrend; a reading below zero helps confirm a downtrend. Zero-line and Signal-line crossovers are used as signals to enter and exit trending trades. Losing signals occur when crossovers come in rapid succession during choppy price action. Divergence shows when momentum is slowing, but it does not indicate when a reversal will occur, if one occurs at all.
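For readers who want to reproduce the indicator, the sketch below follows the four steps described above (price => two EMAs => MACD line => Signal line => Histogram). It is a minimal illustration using pandas with the common default periods of 12, 26 and 9; the sample prices at the end are made up for demonstration and are not VMW data.

```python
import pandas as pd

def macd_histogram(close: pd.Series,
                   fast: int = 12, slow: int = 26, signal: int = 9) -> pd.DataFrame:
    """Return the MACD line, Signal line, and Histogram for a price series."""
    ema_fast = close.ewm(span=fast, adjust=False).mean()   # short EMA of price
    ema_slow = close.ewm(span=slow, adjust=False).mean()   # long EMA of price
    macd = ema_fast - ema_slow                             # MACD line
    sig = macd.ewm(span=signal, adjust=False).mean()       # Signal line: EMA of the MACD line
    hist = macd - sig                                      # Histogram oscillates around zero
    return pd.DataFrame({"macd": macd, "signal": sig, "histogram": hist})

# Example with made-up prices: a histogram value below zero corresponds to
# the bearish reading described in the article.
prices = pd.Series([150.2, 149.8, 148.9, 147.5, 146.8, 146.1, 145.9, 145.14])
print(macd_histogram(prices).tail(1))
```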

Spotlight

The Virtualization Station combines a QNAP NAS with virtualization technology, turning the QNAP NAS into an appliance server. It lets you create or import/export virtual machines on a QNAP Turbo NAS and supports virtual machines running different operating systems such as Windows, Linux and UNIX.


Other News
VPN

VIAVI and VMware Announce Testbed as a Service for RAN Intelligent Controller Testing

Viavi | November 25, 2022

Viavi Solutions Inc. today announced that it has signed a partnership agreement with VMware to drive standardized frameworks and metrics for RAN Intelligent Controller (RIC) testing. This testbed as a service will enable mobile operators to introduce programmability to the RAN and help accelerate the adoption of Open RAN.

The RIC is a cloud-native central component of an open and virtualized RAN network, enabling the optimization of RAN resources through analytic processing and adaptation recommendations. The RIC takes advantage of native and third-party xApps and rApps – microservice-based applications operating in near-real time (near-RT) and non-real time (non-RT), respectively – to enable operators to automate and optimize RAN operations at scale, reduce total cost of ownership, and introduce innovative new services.

VMware is focused on attracting and collaborating with a vibrant ecosystem of partners to help its operator customers adopt Open RAN with complete confidence. VIAVI has the most comprehensive portfolio of Open RAN test solutions in the industry and plays a leading role in defining test processes in the O-RAN ALLIANCE and Telecom Infra Project (TIP). The two companies will work together to demonstrate compliance with RIC-related requirements, assisting CSPs in validating the solution in the lab and scaling it to production.

The industry-leading VIAVI TeraVM RIC Test and the VMware RIC will form a joint testbed as a service for testing, profiling, and validating third-party xApps and rApps. In addition to the framework, the two companies will work together to drive industry consensus around testing methodology and performance metrics. With pre-built test cases and a standardized test method for the RIC and xApps/rApps, operators can reduce the time it takes to validate the solution in their lab and move to a production environment faster.

"Open RAN, by definition, depends on strong collaboration to drive innovation, and that's a perfect way to think about this partnership between leaders in their respective fields. The RIC represents a huge opportunity for the industry: applying AI/ML techniques allows operators to simplify the management of complex 5G configurations and dynamically optimize the network to cater for new use cases, energy efficiency, and changing traffic patterns." Ian Langley, Senior Vice President and General Manager, Wireless Business, VIAVI

"We're excited to work with VIAVI on helping move the industry forward to accelerate the adoption of Open RAN," said Lakshmi Mandyam, vice president, Service Provider Product Management and Partner Ecosystem, VMware. "Our companies share a vision of what it will take to address the challenges hindering adoption by simplifying the path for CSPs to test, profile, and certify third-party xApps and rApps through a common framework. VIAVI's leadership in Open RAN testing and VMware's leadership in RIC make this an ideal collaboration."

About VIAVI

VIAVI (NASDAQ: VIAV) is a global provider of network test, monitoring and assurance solutions for communications service providers, hyperscalers, equipment manufacturers, enterprises, government and avionics. VIAVI is also a leader in light management technologies for 3D sensing, anti-counterfeiting, consumer electronics, industrial, automotive, government and aerospace applications. Together with our customers and partners, we are United in Possibility, finding innovative ways to solve real-world problems.
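For readers unfamiliar with the xApp/rApp model described above, the sketch below shows the general shape of a near-real-time control loop: read RAN metrics, apply a policy, and emit an adaptation recommendation. It is purely illustrative; there is no real RIC SDK or E2 interface here, and every function name, metric, and threshold is a hypothetical stand-in for what a production xApp (and the VIAVI/VMware testbed) would actually exercise.

```python
import random
import time

# Hypothetical near-RT xApp skeleton. A real xApp would use the RIC platform's
# interfaces to subscribe to metrics and push control actions; here the metric
# source is simulated so the sketch is self-contained and runnable.

def read_cell_metrics(cell_id: str) -> dict:
    """Stand-in for a metrics subscription; returns simulated PRB utilization."""
    return {"cell": cell_id, "prb_utilization": random.uniform(0.2, 0.95)}

def recommend(metrics: dict) -> str:
    """Toy policy: suggest offloading traffic when utilization is high."""
    if metrics["prb_utilization"] > 0.85:
        return f"offload traffic from {metrics['cell']}"
    return "no action"

if __name__ == "__main__":
    for _ in range(3):                      # near-RT loop (sub-second in practice)
        m = read_cell_metrics("cell-001")
        print(m, "->", recommend(m))
        time.sleep(0.1)
```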

Read More

VPN

Veracode Launches Container Security Offering That Secures Cloud-Native Application Development

Veracode | October 11, 2022

Veracode, a leading global provider of application security testing solutions, today announced the enhancement of its Continuous Software Security Platform to include container security. An early access program for Veracode Container Security is now underway for existing customers. The new Veracode Container Security offering, designed to meet the needs of cloud-native software engineering teams, addresses vulnerability scanning, secure configuration, and secrets management requirements for container images.

Veracode Chief Product Officer, Brian Roche, said, “As developers embrace cloud-native computing practices, containers have become increasingly important for business efficiency. This launch helps close a substantial gap in the market for developer-friendly solutions that cover critical capabilities for container security. We are excited to bring this next enhancement of our platform to the market and empower customers to address security testing for more modern architectures and deployment styles.”

The Requirement for Container Security is Rapidly on the Rise

Containers are increasingly used to simplify software deployment and runtime environment configuration management. They comprise small, fast, portable units of software in which code is packaged so that an application can run quickly and reliably in different computing environments, from the desktop to the cloud. They provide an ecosystem of repositories, orchestration technologies, and capabilities that address related issues, such as service-to-service communication and configuration management. Instantiated in pipelines from code, containers have the benefit of immutability, meaning they are not updated, reconfigured or patched in production. Instead, the underlying image is updated with new capabilities and redeployed, helping to improve efficiency in the production environment.

Despite their benefits, containers are affected by many of the same problems that traditionally plague physical or virtual server hardware, such as vulnerabilities introduced through additional software, poorly managed secrets (like Amazon Web Services keys and credentials in Dockerfiles), and security misconfigurations. This has resulted in increased demand for products that address these and related problems, with the global container security market size expected to reach $3.9 billion by 2027*. Container security scanning analyzes container images against organizational or industry-specific standards to identify insecure processes, misconfigurations that could lead to a vulnerability, and inadequate authentication and access control.

Veracode Container Security Integrates into the Developer Environment

Many products already in the market are aimed at securing containers at runtime and offer limited support for developers, posing a major challenge for early remediation. Veracode's solution instead integrates into the CI/CD (continuous integration and continuous delivery) pipeline and is available at the command line interface. Providing coverage for vulnerability detection and remediation, secrets management, and security configuration issues on the most popular operating systems, it delivers remediation advice to developers early in the software development life cycle so that insecure containers don't ship to production.

Veracode Container Security results are available in a variety of formats based on the user's choice, including text, JSON (JavaScript Object Notation), and Software Bill of Materials (CycloneDX, SWID [Software Identification Tagging], or SPDX [Software Package Data Exchange]), making them easy to integrate with other tools. Providing developers and their teams with tools that meet their specific needs means they can find and fix vulnerabilities early in the lifecycle, giving them confidence that their containerized application environment is secure.

“Veracode Container Security will be instrumental for our developers to ensure that the workloads they deploy into our cloud are secure,” said the Director of Information Security at an automotive company. “Without this tool, it would take our team weeks to receive and action container results, and these would only have been available in limited formats. Now, we're excited to integrate findings into the pipeline before they even move into production, creating time and cost efficiencies for our business.”

About Veracode

Veracode is a leading AppSec partner for creating secure software, reducing the risk of security breach, and increasing security and development teams' productivity. As a result, companies using Veracode can move their business, and the world, forward. With its combination of process automation, integrations, speed, and responsiveness, Veracode helps companies get accurate and reliable results so they can focus their efforts on fixing, not just finding, potential vulnerabilities.
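To make the secrets-management point above concrete, here is a minimal sketch of the kind of check a container-focused scanner performs: walking a Dockerfile for hardcoded AWS-style credentials before an image ever ships. This is an illustrative stand-alone script, not Veracode's product or API; the regex rules and command-line usage are assumptions, and real scanners ship far broader rule sets.

```python
import re
import sys

# Illustrative patterns for AWS-style credentials only; production scanners
# cover many secret types, misconfigurations, and package vulnerabilities.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"AKIA[0-9A-Z]{16}"),
    "aws_secret_access_key": re.compile(r"(?i)aws_secret_access_key\s*[=:]\s*\S+"),
}

def scan_dockerfile(path: str) -> list[tuple[int, str]]:
    """Return (line_number, rule_name) pairs for suspected hardcoded secrets."""
    findings = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            for name, pattern in SECRET_PATTERNS.items():
                if pattern.search(line):
                    findings.append((lineno, name))
    return findings

if __name__ == "__main__":
    # Usage: python scan_dockerfile.py path/to/Dockerfile
    for lineno, rule in scan_dockerfile(sys.argv[1]):
        print(f"line {lineno}: possible {rule}")
```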

Read More

VIRTUAL SERVER INFRASTRUCTURE

Red Hat Introduces Lightweight Kubernetes Solution to Power the Next Evolution of Open Edge Computing

Red Hat | October 28, 2022

Red Hat, Inc., the world's leading provider of open source solutions, today introduced Red Hat Device Edge, a solution for flexibly deploying traditional or containerized workloads on small devices such as robots, IoT gateways, points of sale, public transport and more. Red Hat Device Edge delivers an enterprise-ready and supported distribution of the Red Hat-led open source community project MicroShift, a lightweight Kubernetes orchestration solution built from the edge capabilities of Red Hat OpenShift, along with an edge-optimized operating system built from Red Hat Enterprise Linux. This latest product in the Red Hat edge portfolio aims to provide a future-proof platform that allows organizations' architecture to evolve as their workload strategy changes.

As more companies deploy edge computing across a broader range of use cases, new questions, operational needs and business challenges arise. In industries like automotive, manufacturing and more, organizations face different environmental, security and operational challenges that require the ability to work with small form-factor edge devices in resource-constrained environments. Ultimately, different devices have different requirements in terms of computing power, software compatibility and security footprint.

With Red Hat Device Edge, organizations have the flexibility to deploy containers at the edge in a small footprint, reducing compute requirements by up to 50% in comparison to traditional Kubernetes edge configurations. It also helps address many of the emerging questions around large-scale edge computing at the device edge by incorporating:

- Kubernetes built for edge deployments, enabling IT teams to use familiar Kubernetes features in the new, smaller, lighter-weight footprint offered by MicroShift. This lowers the barrier to entry for teams building cloud-native applications for edge computing environments and enables them to use existing Kubernetes skills to achieve greater consistency of operations across the entirety of the hybrid cloud, from the datacenter to public clouds to the edge.
- An edge-optimized Linux OS built from the world's leading enterprise Linux platform, Red Hat Enterprise Linux, and tailored for small edge devices with intelligent updates that use minimum bandwidth. This helps organizations tackle the challenges of intermittent connectivity while mitigating the impact on edge innovation.
- Capabilities for centrally scaling and monitoring edge device fleets with Red Hat Smart Management. IT teams can use zero-touch provisioning, system health visibility and updates with automatic rollbacks to maintain a stronger edge management and application security posture.

Red Hat Device Edge for far-flung, resource-constrained use cases across different industries

Red Hat Device Edge was created to help Red Hat customers and partners tackle their most challenging edge environments. For example, Lockheed Martin has been collaborating with Red Hat in the MicroShift project community and is also working to deploy Red Hat Device Edge to modernize and standardize its application delivery and AI workloads in extreme conditions, including wildland fire management, contested military environments and space. Additionally, ABB is planning to use Red Hat Device Edge for ABB Ability™ Edgenius™ on resource-constrained devices. Edgenius is a comprehensive edge platform for industrial software applications.

Red Hat Device Edge will be aimed at organizations that require small form-factor edge devices with support for bare metal, virtualized or containerized applications, regardless of industry. Additional use cases include but are not limited to:

- Miniature, connected nodes on public transportation, where edge devices are often in motion but still need fast local AI/ML processing to analyze data in real time (e.g. railways, mining, cars, drones).
- Resilient resource nodes at challenging locations like weather monitoring stations where, despite harsh, tough-to-support environments, an edge device can still take care of itself with automated software rollbacks, a stronger security posture and better enforcement of sensitive data controls.
- Emerging edge-constrained scenarios where thousands of edge devices may be running applications in locations that make weight, temperature and connectivity all major concerns.

Red Hat Device Edge meets organizations wherever they are today in their edge computing journey, as it will run a wide variety of workloads using Podman for edge container management or MicroShift for a Kubernetes API. Customers will even be able to run legacy Windows applications within a virtual machine.

Availability

Red Hat Device Edge is planned as a developer preview early next year and is expected to be generally available with full support later in 2023.

Supporting Quotes

Francis Chow, vice president and general manager, In-Vehicle Operating System and Edge, Red Hat
“Innovation at the edge has introduced new benefits and use cases for organizations across all industries, but it has also created new challenges. Working with our customers and partners, Red Hat began a journey to develop a new technology offering, designed specifically for the edge, extending our hybrid cloud solution so that our ecosystem will be able to take advantage of intelligence paired with trusted open source technology to tackle their smallest-footprint remote edge use cases. Tested in the community, now produced by Red Hat, Red Hat Device Edge is a major next step in harnessing the full range of benefits promised by edge computing.”

Justin Taylor, vice president of Artificial Intelligence, Lockheed Martin
“Red Hat Device Edge will enable Lockheed Martin to revolutionize artificial intelligence processing for our DOD customers' most challenging missions. The ability for small military platforms to handle large AI workloads will increase their capacity in the field, ensuring our military can stay ahead of evolving threats. Lockheed Martin is a long-time customer of and collaborator with Red Hat, and working on Red Hat Device Edge together is a critical next step in this strategic relationship.”

Bernhard Eschermann, chief technology officer, ABB Process Automation
“ABB is excited to continue working with Red Hat to streamline the transition from automated to autonomous operations for the manufacturing industry. With Red Hat Device Edge, ABB will be able to connect cloud and control environments to improve asset efficiency and operations by aggregating and analyzing data on devices with more limited resources. With this continued collaboration, ABB's ecosystem will experience the benefit of open source solutions driven by innovative market leaders now and in the future.”
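The announcement notes that Red Hat Device Edge can run containers under Podman or expose a Kubernetes API through MicroShift. Because that API is standard Kubernetes, ordinary client libraries should be able to talk to a device; the sketch below illustrates the idea with the Python Kubernetes client. The kubeconfig path is an assumption for illustration, and how credentials actually reach a fleet-management tool would depend on the deployment.

```python
from kubernetes import client, config

# Minimal sketch: list the workloads running on a MicroShift-based edge device
# by pointing the standard Kubernetes client at its API endpoint.
def list_edge_workloads(kubeconfig_path: str = "~/.kube/microshift-config") -> None:
    config.load_kube_config(config_file=kubeconfig_path)   # authenticate to the device's API
    v1 = client.CoreV1Api()
    for pod in v1.list_pod_for_all_namespaces().items:     # same calls as full Kubernetes
        print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)

if __name__ == "__main__":
    list_edge_workloads()
```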
About Red Hat, Inc.

Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Read More

VIRTUAL DESKTOP STRATEGIES

VyOS Gets Featured as a Fast Mover and a Challenger in GigaOm Radar for Network Operating Systems

VyOS Networks Corporation | September 21, 2022

VyOS Networks Corporation, the company that develops the VyOS Network Platform and provides support services, announced today that it was featured as a challenger and a fast mover in the GigaOm Radar for Network Operating Systems reports for the SMB, enterprise and cloud/managed service provider segments. The full report is available upon request to everyone interested.

"We are proud to be featured in the GigaOm Radar reports again this year. We are glad that our effort to provide our customers with a stable and feature-rich platform is recognized," said Yuriy Andamasov, VyOS Networks Corporation CEO.

"There is more to VyOS than market reports cover, however. GigaOm Radar compares products where their use cases overlap, but VyOS also covers unique use cases that other vendors don't, such as custom embedded hardware support and modification of VyOS itself," adds Daniil Baturin, VyOS Networks Corporation CTO.

About VyOS

VyOS is an open-source network operating system. Its slogan is "a universal router" because it supports multiple deployment scenarios and roles: bare-metal hardware from small boards to large servers, all popular virtualization platforms including VMware, KVM, and Microsoft Hyper-V, and multi-cloud support for major hyperscalers like Amazon Web Services, Microsoft Azure, Google Cloud and Oracle Cloud Infrastructure. VyOS supports multiple dynamic routing protocols via FRRouting, various VPN protocols, and other network routing and security features, available through a unified stateful CLI and an HTTP API for management automation.

About VyOS Networks Corporation

VyOS Networks Corporation is a company started by the VyOS open-source project founders and maintainers to provide services for it and ensure its sustainable development.
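The mention of an HTTP API for management automation is the piece most relevant to scripted operations. The sketch below shows how such a call could be driven from Python; the host address and API key are placeholders, and while the /retrieve endpoint and the op/path payload shape follow the commonly documented VyOS API, you should verify the exact request format against the API documentation for your VyOS version before relying on it.

```python
import json
import requests

VYOS_HOST = "https://192.0.2.1"        # placeholder router address
API_KEY = "replace-with-your-api-key"  # placeholder API key configured on the router

def show_config(path: list[str]) -> dict:
    """Fetch part of the running configuration via the VyOS HTTP API."""
    resp = requests.post(
        f"{VYOS_HOST}/retrieve",
        data={"data": json.dumps({"op": "showConfig", "path": path}), "key": API_KEY},
        verify=False,                  # lab routers often use self-signed certificates
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Example: retrieve the interfaces subtree of the running configuration.
    print(show_config(["interfaces"]))
```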

Read More


Resources