27-year-old Search Marketing CEO Lands 13-stop U.S. Speaking Tour

Digital Summit, the largest conference series in the digital marketing industry, has invited Mehrguth to share his presentation, “5 Data-Validated Tactics to Increase the Experienced Marketer’s Qualified Lead Volume (…and 3 Tactics Guaranteed to Fail),” with audiences across the nation.

Spotlight

Decypher Technologies, Inc

We know how frustrating technology can be, and our friendly staff is here to help. With Apple, Microsoft, Cisco, and other certified technicians, we have the expertise to take the complexity out of your technology. We provide prompt same-day service, and when you call during business hours you’ll speak to a person, not a recording. Have a problem after hours? We always have a tech on call to offer emergency support.

OTHER ARTICLES
Virtual Desktop Tools, Virtual Desktop Strategies

VM Applications for Software Development and Secure Testing

Article | June 8, 2023

Contents
1. Introduction
2. Software Development and Secure Testing
3. Using VMs in Software Development and Secure Testing
4. Conclusion

1. Introduction

“Testing is an infinite process of comparing the invisible to the ambiguous in order to avoid the unthinkable happening to the anonymous.” —James Bach

Testing software is crucial for identifying and fixing security vulnerabilities. However, meeting quality standards for functionality and performance does not guarantee security. Software testing is therefore essential for identifying and addressing application security vulnerabilities in order to maintain the following:

Security of data history, databases, information, and servers
Customers’ integrity and trust
Web application protection from future attacks

VMs provide a flexible and isolated environment for software development and security testing. They allow complex configurations and testing scenarios to be replicated easily, enabling efficient issue resolution. VMs also support secure testing by isolating applications from the host system and allowing a reset to a previous state. In addition, they facilitate DevOps practices and streamline the development workflow.

2. Software Development and Secure Testing

Software Secure Testing: The Approach

The following approaches must be considered while preparing and planning for security tests:

Architecture Study and Analysis: Understand whether the software meets the necessary requirements.
Threat Classification: List all potential threats and risk factors that must be tested.
Test Planning: Run the tests based on the identified threats, vulnerabilities, and security risks.
Testing Tool Identification: Identify the security testing tools relevant to the software and its specific use cases.
Test-Case Execution: After performing a security test, fix the identified vulnerabilities, either manually or using suitable open-source code.
Reports: Prepare a detailed report of the security tests performed, listing the vulnerabilities and threats resolved as well as the issues that are still pending.

Ensuring the security of an application that handles essential functions is paramount. This may involve safeguarding databases against malicious attacks or implementing fraud detection mechanisms for incoming leads before integrating them into the platform. Maintaining security is crucial throughout the software development life cycle (SDLC) and must be at the forefront of developers’ minds while implementing the software’s requirements. With consistent effort, the SDLC pipeline addresses security issues before deployment, reducing the risk of discovering application vulnerabilities and minimizing the damage they could cause.

A secure SDLC makes developers responsible for critical security. Developers need to be aware of potential security concerns at each step of the process, which requires integrating security into the SDLC in ways that were not needed before. Because anyone can potentially access source code, coding with potential vulnerabilities in mind is essential. A robust and secure SDLC process is therefore critical to ensuring applications are not subject to attacks by hackers.

3. Using VMs in Software Development and Secure Testing

Snapshotting: Snapshotting allows developers to capture a VM’s state at a specific point in time and restore it later. This feature is helpful for debugging and enables developers to roll back to a previous state when an error occurs. A virtual machine provides several operations for creating and managing snapshots and snapshot chains. These operations let users create snapshots, revert to any snapshot in the chain, and remove snapshots. Extensive snapshot trees can also be created to streamline the workflow.
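For illustration, a minimal sketch of this snapshot-and-rollback workflow using the libvirt Python bindings is shown below. The QEMU/KVM connection URI, the domain name "dev-vm", and the snapshot name are assumptions made for the example, not details from the article.

```python
# Minimal sketch: snapshot and rollback with the libvirt Python bindings.
# Assumes a local QEMU/KVM host and a domain named "dev-vm" (illustrative name).
import libvirt

SNAPSHOT_XML = """
<domainsnapshot>
  <name>before-risky-change</name>
  <description>State captured prior to running the new test build</description>
</domainsnapshot>
"""

conn = libvirt.open("qemu:///system")     # connect to the local hypervisor
dom = conn.lookupByName("dev-vm")         # find the development VM

# Capture the current state so the test run can be rolled back later.
snap = dom.snapshotCreateXML(SNAPSHOT_XML, 0)

# ... run the risky test or deployment against the VM here ...

# Roll back to the captured state if the change misbehaves.
dom.revertToSnapshot(snap, 0)

# Remove the snapshot once it is no longer needed.
snap.delete(0)
conn.close()
```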
Virtual Networking: Virtual networking allows virtual machines to be connected to virtual networks that simulate complex network topologies, letting developers test their applications in different network environments. It also allows data centers to expand across multiple physical locations and gives access to a range of more efficient options. Developers can modify the network as requirements change without any additional hardware, and provisioning the network for specific applications and needs offers greater flexibility. Workloads can also be moved seamlessly across the network infrastructure without compromising service, security, or availability.

Resource Allocation: VMs can be configured with specific resource allocations such as CPU, RAM, and storage, allowing developers to test their applications under different resource constraints. Maintaining a 1:1 ratio between virtual machine processors and host cores is highly recommended; over-subscribing virtual machine processors to a single core can lead to stalled or delayed events and significant user frustration. In practice, however, IT administrators sometimes overallocate virtual machine processors. In such cases, a practical approach is to start with a 2:1 ratio and gradually move towards 4:1, 8:1, 12:1, and so on as virtual allocation is introduced into the IT infrastructure. This ensures a safe and seamless transition towards optimized virtual resource allocation.

Containerization within VMs: Containerization within VMs provides an additional layer of isolation and security for applications. Enterprises are finding new use cases for VMs that make use of their in-house and cloud infrastructure to support heavy-duty application and networking workloads, which also has a positive impact on the environment. DevOps teams combine containerization with virtualization to improve software development flexibility. Containers package an application together with the components it needs, such as code, system tools, and libraries. For complex applications, virtual machines and containers are used together: containers typically serve the front end and middleware, while VMs serve the back end.

VM Templates: VM templates are pre-configured virtual machines that serve as a base for creating new virtual machines, making it easier to set up development and testing environments. A VM template is an image of a virtual machine that serves as a master copy; it includes the VM's disks, virtual devices, and settings. A template can be cloned multiple times, and the resulting clones are independent and not linked to the template. VM templates are handy when a large number of similar VMs need to be deployed, and they preserve VM consistency. To edit a template, convert it to a VM, make the necessary changes, and then convert the edited VM back into a new template.

Remote Access: VMs can be accessed remotely, allowing developers and testers to collaborate more effectively from anywhere in the world. To manage a virtual machine remotely, enable remote access, connect to the virtual machine, and then access the VNC or serial console. Once connected, full permission to manage the virtual machine is granted with the user's approval. Remote access provides a secure way to reach VMs, as connections can be encrypted and authenticated to prevent unauthorized access. It also simplifies management, since administrators can monitor and control virtual machines from a central location.
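As a rough illustration of encrypted, key-authenticated remote access, the sketch below uses the paramiko SSH library. The host name, user, and key path are placeholders, and SSH is only one of several remote-access options (the VNC and serial consoles mentioned above are others).

```python
# Minimal sketch: encrypted, key-authenticated remote access to a VM over SSH
# using paramiko. Host name, user, and key path are illustrative placeholders.
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()                        # trust known hosts only
client.set_missing_host_key_policy(paramiko.RejectPolicy())

client.connect(
    hostname="test-vm.internal.example",
    username="qa",
    key_filename="/home/qa/.ssh/id_ed25519",          # key-based auth, no passwords
)

# Run a quick health check on the remote VM and print the result.
stdin, stdout, stderr = client.exec_command("uptime")
print(stdout.read().decode().strip())

client.close()
```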
DevOps Integration: DevOps is a collection of practices, principles, and tools that allow a team to release software quickly and efficiently. Virtualization is vital in DevOps when developing intricate cloud, API, and SOA systems. Virtual machines enable teams to simulate environments for creating, testing, and launching code while conserving computing resources. When starting a bug hunt at the API layer, teams find that virtual machines are well suited to test-driven development (TDD). Virtualization providers handle updates, freeing DevOps teams to focus on other areas and increasing productivity by 50–60%. In addition, VMs allow multiple release and patch levels to be tested simultaneously, improving product compatibility and interoperability.

4. Conclusion

The outlook for virtual machine applications is highly promising in the development and testing fields. As development and testing processes grow more complex, VMs can significantly simplify and streamline these operations. In the future, VMs are expected to become even more versatile and powerful, providing developers and testers with a broader range of tools and capabilities to support the development process. One potential future development is the integration of machine learning and artificial intelligence into VMs, which would enable VMs to automate tasks, optimize resource allocation, and generate recommendations based on performance data. VMs may also become more agile and lightweight, allowing developers and testers to spin instances up and down more efficiently. The future of VM applications for software development and security testing looks bright, with continued innovation expected to give developers and testers even more powerful and flexible tools for improving the software development process.

Read More
Virtual Desktop Tools, Server Hypervisors

Virtual Machine Security Risks and Mitigation in Cloud Computing

Article | June 8, 2023

Analyzing risks and implementing advanced mitigation strategies: safeguard critical data, fortify defenses, and stay ahead of emerging threats in the dynamic realm of virtual machines in the cloud.

Contents
1. Introduction
2. 10 Security Risks Associated with Virtual Machines in Cloud Computing
3. Best Practices to Avoid Security Compromise
4. Conclusion

1. Introduction

Cloud computing has revolutionized the way businesses operate by providing flexible, scalable, and cost-effective infrastructure for running applications and services. Virtual machines (VMs) are a key component of cloud computing, allowing multiple virtual machines to run on a single physical machine. However, the use of virtual machines in cloud computing introduces new security risks that must be addressed to ensure the confidentiality, integrity, and availability of data and services. Effective VM security in the cloud requires a comprehensive approach in which cloud providers and users work together to identify and address potential virtual machine security threats. By implementing the best practices below and maintaining a focus on security, cloud computing can provide a secure and reliable platform for businesses to run their applications and services.

2. 10 Security Risks Associated with Virtual Machines in Cloud Computing

Denial of Service (DoS) attacks: Attacks that aim to disrupt the availability of a VM or the entire cloud infrastructure by overwhelming the system with traffic or resource requests.
Insecure APIs: Cloud providers often expose APIs that allow users to manage their VMs. If these APIs are not properly secured, attackers can exploit them to gain unauthorized access to VMs or manipulate their configurations.
Data leakage: Virtual machines can store sensitive data such as customer information or intellectual property. If not secured, this data can be exposed to unauthorized access or leakage.
Shared resources: VMs in cloud environments often share physical resources such as memory, CPU, and network interfaces. If these resources are not isolated, a compromised VM can affect the security and performance of other VMs running on the same physical host.
Lack of visibility: Virtual machines in cloud environments can be more difficult to monitor than physical machines, making it harder to detect security incidents or anomalous behavior.
Insufficient logging and auditing: If cloud providers do not implement appropriate logging and auditing mechanisms, it can be difficult to determine the cause and scope of a security incident.
VM escape: An attacker gains access to the hypervisor layer and then escapes into the host operating system or other VMs running on the same physical host.
Side-channel attacks: An attacker exploits the physical characteristics of the hardware to gain unauthorized access to a VM. Examples include timing attacks, power analysis attacks, and electromagnetic attacks.
Malware attacks: VMs can be infected with malware just like physical machines. Malware can be used to steal data, launch attacks on other VMs or systems, or disrupt the functioning of the VM.
Insider threats: Malicious insiders can exploit their access to VMs to steal data, modify configurations, or launch attacks.
3. Best Practices to Avoid Security Compromise

To mitigate these risks, cloud service providers and users can follow several virtual machine security guidelines:

Keep software up to date: Regularly applying software updates and security patches to virtual machines is crucial for preventing known vulnerabilities from being exploited. Updates fix bugs and security flaws that could allow unauthorized access, data breaches, or malware attacks. According to one study, 60% of data breaches are caused by vulnerabilities that were not patched or updated in a timely manner. (Source: Ponemon Institute)

Use secure hypervisors: A hypervisor is the software layer that enables multiple virtual machines to run on a single physical server. Secure hypervisors are designed to prevent unauthorized access to virtual machines and protect them from potential security threats. When choosing a hypervisor, select one that has undergone rigorous testing and meets industry security standards. In 2018, researchers discovered an attack called "Foreshadow" (also known as L1 Terminal Fault) that exploits vulnerabilities in Intel processors and can be used to steal sensitive data from virtual machines running on the same physical host. Secure hypervisors that implement hardware-based security features can provide protection against Foreshadow and similar attacks. (Source: Foreshadow)

Implement strong access controls: Access control is the practice of restricting access to virtual machines to authorized users. Multi-factor authentication adds an extra layer of security by requiring more than one authentication method before VMs can be accessed. Strong access controls limit the risk of unauthorized access and can help prevent data breaches. According to one survey, organizations that implemented multi-factor authentication saw a 98% reduction in the risk of phishing-related account breaches. (Source: Duo Security)

Monitor VMs for anomalous behavior: Monitoring virtual machines for unusual or unexpected behavior is an essential security practice. This includes monitoring network traffic, processes running on the VM, and other metrics that can help detect potential security incidents. By monitoring VMs, security teams can detect and respond to threats before they cause damage. One study found that 90% of organizations that implemented a virtualized environment experienced security benefits, such as improved visibility into security threats and faster incident response times. (Source: VMware)

Use encryption: Encryption is the process of encoding information so that only authorized parties can access it. Encrypting data both in transit and at rest protects it from interception or theft, and can be achieved using industry-standard encryption protocols and technologies. According to an IBM report, the average cost of a data breach in 2020 was $3.86 million, and organizations that implemented encryption had a lower average cost of a data breach than those that did not. (Source: IBM)
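As a small illustration of encryption at rest, the sketch below uses the Fernet API from the Python cryptography package to encrypt an exported VM disk image before it is stored. The file names are illustrative, and real deployments would also need proper key management.

```python
# Minimal sketch: encrypting a VM export at rest with authenticated symmetric
# encryption (Fernet, from the `cryptography` package). File names are
# illustrative; key management is out of scope here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # store this in a secrets manager, not on disk
fernet = Fernet(key)

# Encrypt the exported disk image before it is written to shared storage.
with open("vm-export.qcow2", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("vm-export.qcow2.enc", "wb") as f:
    f.write(ciphertext)

# Later, an authorized party holding the key can recover the original image.
plaintext = fernet.decrypt(ciphertext)
```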
Segregate VMs: Segregating virtual machines means keeping sensitive VMs separate from less sensitive ones. This reduces the risk of lateral movement, in which an attacker gains access to one VM and uses it as a stepping stone to reach other VMs in the same environment. Segregation helps minimize the risk of data breaches and limits the potential impact of a security incident. One study found that organizations that implemented a virtualized environment without adequate segregation and access controls were more vulnerable to VM security breaches and data loss. (Source: Ponemon Institute)

Regularly back up VMs: Regularly backing up virtual machines is a critical security practice that can mitigate the impact of malware attacks, system failures, or other security incidents. Backups should be stored securely and tested regularly to ensure they can be restored quickly when needed. One survey found that 42% of organizations experienced a data loss event in 2020, with the most common cause being accidental deletion by an employee (29%). (Source: Veeam)

4. Conclusion

The complexity of cloud environments and the shared responsibility model for security require organizations to adopt a comprehensive security approach that spans multiple infrastructure layers, from the physical layer to the application layer. The future of virtual machine security in cloud computing will require continued innovation and adaptation to new threats and vulnerabilities. Organizations must therefore remain vigilant and proactive in their security efforts, leveraging the latest technologies and best practices to protect their virtual machines and the sensitive data and resources they contain.

Read More
Server Hypervisors

ProtonVPN iOS app now supports the OpenVPN protocol

Article | September 9, 2022

Your ProtonVPN iOS app is now better equipped to fight censorship and offers more flexible connection options with the launch of OpenVPN for iOS. OpenVPN is one of the best VPN protocols because of its flexibility, security, and resistance to blocks. You now have the option to switch between the faster IKEv2 protocol and the more stable, censorship-resistant OpenVPN protocol.

Read More
VMware

VMware Tanzu Kubernetes Grid Integrated: A Year in Review

Article | December 14, 2021

The modern application world is advancing at an unprecedented rate, but the new possibilities these transformations make available don’t come without complexities. IT teams often find themselves under pressure to keep up with the speed of innovation. That’s why VMware provides a production-ready container platform that aligns with upstream Kubernetes: VMware Tanzu Kubernetes Grid Integrated (formerly known as VMware Enterprise PKS). By working with VMware, customers can move at the speed their businesses demand without the headache of trying to run their operations alone. Our offerings help customers stay current with the open source community's innovations while having access to the support they need to move forward confidently. Many changes have been made to Tanzu Kubernetes Grid Integrated over the past year that are designed to help customers keep up with Kubernetes advancements, move faster, and enhance security.

Kubernetes updates

The latest version, Tanzu Kubernetes Grid Integrated 1.13, moved to Kubernetes 1.22 and removed beta APIs in favor of the stable APIs that have since evolved from them. Over time, some APIs evolve; beta APIs typically change more often than stable APIs and should therefore be checked before updates occur. The APIs listed below are no longer served in v1.22, as they have been replaced by more stable API versions:

Beta versions of the ValidatingWebhookConfiguration and MutatingWebhookConfiguration APIs (the admissionregistration.k8s.io/v1beta1 API versions)
The beta CustomResourceDefinition API (apiextensions.k8s.io/v1beta1)
The beta APIService API (apiregistration.k8s.io/v1beta1)
The beta TokenReview API (authentication.k8s.io/v1beta1)
Beta API versions of SubjectAccessReview, LocalSubjectAccessReview, and SelfSubjectAccessReview (the authorization.k8s.io/v1beta1 API versions)
The beta CertificateSigningRequest API (certificates.k8s.io/v1beta1)
The beta Lease API (coordination.k8s.io/v1beta1)
All beta Ingress APIs (the extensions/v1beta1 and networking.k8s.io/v1beta1 API versions)
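Because the beta Ingress APIs are no longer served, manifests and client code should target networking.k8s.io/v1. The sketch below, using the official Kubernetes Python client, shows what that looks like; the namespace, host, and service names are illustrative and not taken from this article.

```python
# Minimal sketch: creating an Ingress against the stable networking.k8s.io/v1 API
# with the official Kubernetes Python client, since the extensions/v1beta1 and
# networking.k8s.io/v1beta1 Ingress APIs are no longer served in Kubernetes 1.22.
# Namespace, host, and service names are illustrative.
from kubernetes import client, config

config.load_kube_config()          # or load_incluster_config() inside a pod

ingress = client.V1Ingress(
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1IngressSpec(
        rules=[client.V1IngressRule(
            host="web.example.com",
            http=client.V1HTTPIngressRuleValue(paths=[client.V1HTTPIngressPath(
                path="/",
                path_type="Prefix",
                backend=client.V1IngressBackend(
                    service=client.V1IngressServiceBackend(
                        name="web-svc",
                        port=client.V1ServiceBackendPort(number=80),
                    )
                ),
            )]),
        )]
    ),
)

# Create the Ingress through the stable NetworkingV1 API group.
client.NetworkingV1Api().create_namespaced_ingress(namespace="default", body=ingress)
```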
Containerd support

Tanzu Kubernetes Grid Integrated helps customers eliminate lengthy deployment and management processes with on-demand provisioning, scaling, patching, and updating of Kubernetes clusters. To stay aligned with the Kubernetes community, containerd is now the default container runtime, although Docker can still be selected through the command-line interface (CLI) if needed.

Networking

Several networking updates have been made as well, including Antrea support and NSX-T enhancements.

Antrea support: With Tanzu Kubernetes Grid Integrated version 1.10 and later, customers can leverage Antrea on install or upgrade to use Kubernetes network policies. This gives enterprises the best of both worlds: access to the latest innovation from Antrea and world-class support from VMware.

NSX-T enhancements: NSX-T was integrated with Tanzu Kubernetes Grid Integrated to simplify container networking and increase security. This has been enhanced so customers can now choose the policy API as an option on a fresh installation of Tanzu Kubernetes Grid Integrated, which gives users access to new features available only through the NSX-T policy API. This feature is currently in beta. In addition, more NSX-T and NSX Container Plug-in (NCP) configuration is possible through network profiles, which allows configurations to be set through the CLI and persist across lifecycle events.

Storage enhancements

We’ve made storage operations in our customers’ container-native environments easier, too. Customers were seeking a simpler and more secure way to manage the Container Storage Interface (CSI), so we introduced automatic installation of the vSphere CSI driver as a BOSH process beginning with Tanzu Kubernetes Grid Integrated 1.11. Also, as VCP will be deprecated, customers are advised to use the CSI driver. VCP-to-CSI migration is part of Tanzu Kubernetes Grid Integrated 1.12 and is designed to help customers move forward faster.

Enhanced security

Implementing new technologies provides users with new capabilities, but it can also lead to new security vulnerabilities if not done correctly. VMware’s goal is to help customers move forward with ease and the confidence that enhancements don’t compromise core security needs.

CIS benchmarks: This year, Tanzu Kubernetes Grid Integrated continued to see improvements that help meet today’s high security standards. Meeting the Center for Internet Security (CIS) benchmark standards is vital for Tanzu Kubernetes Grid Integrated. In recent releases, a few Kubernetes-related settings have been adjusted to ensure compliance with CIS requirements:

kube-apiserver with --kubelet-certificate-authority settings (v1.12)
kube-apiserver with an --authorization-mode argument that includes Node (v1.12)
kube-apiserver with a proper --audit-log-maxage argument (v1.13)
kube-apiserver with a proper --audit-log-maxbackup argument (v1.13)
kube-apiserver with a proper --audit-log-maxsize argument (v1.13)

Certificate rotations: Tanzu Kubernetes Grid Integrated secures all communication between its control plane components and the Kubernetes clusters it manages using TLS, validated by certificates. Certificate rotation has been simplified in recent releases: customers can now list and update certificates on a cluster-by-cluster basis through the “tkgi rotate-certificates” command. The multistep, manual process was replaced with a single CLI command to rotate NSX-T certificates (available since Tanzu Kubernetes Grid Integrated 1.10) and cluster-by-cluster certificates (available since Tanzu Kubernetes Grid Integrated 1.12).

Hardening of images: Tanzu Kubernetes Grid Integrated keeps OS images, container base images, and software library versions updated to remediate the CVEs reported by customers and in the industry. It also continues to use the latest Ubuntu Xenial stemcell versions for node virtual machines. In recent releases and patch versions, the versions of dockerd, containerd, runc, telegraf, and nfs-utils have been bumped to the latest stable and secure versions as well. By using Harbor as a private registry management service, customers can also leverage the built-in vulnerability scanning features to discover CVEs in application images.

VMware is dedicated to supporting customers with production readiness by enhancing the user experience. Tanzu Kubernetes Grid Integrated Edition has stayed up to date with the Kubernetes community and provides customers with the support and resources they need to innovate rapidly.

Read More


Related News

Global Search Marketing Agency, Directive, Announces Complete Rebranding

Directive Consulting | December 05, 2018

Directive, the leading B2B and enterprise search marketing agency, announced today the launch of their new branding to reflect their evolution into a global search marketing agency. Directive’s unique approach to search marketing has positioned the company as the agency of choice for leading B2B and enterprise companies since 2014, with a portfolio that is 90% B2B. With this extensive rebranding, the company continues to offer the premier SEO, PPC, CRO, content marketing, and paid social services that B2B and enterprise companies need to scale their businesses. Additionally, Directive continues to invest further in employee well-being, marketing technologies, and superior support for clients. “Our rebranding does not impact our services, operations or our market, as we have been working with leaders in the B2B space for some time; however, our identity now reflects and matches that,” stated Hannah Mans, Directive’s director of marketing. “This milestone is the first of many as we work towards our vision to be the largest global B2B search agency by the end of 2020.” The rebranding includes a top-to-bottom redesign of the company’s website and logo to better resonate with current and potential clients.

Read More

Directive Ranks #1 in Clutch’s Top B2B Marketing Service Providers

Directive Consulting | March 06, 2019

Directive, the leading B2B and enterprise search marketing agency, has recently been honored as the number one B2B marketing and advertising service provider in Los Angeles, according to Clutch. Clutch is a B2B research, ratings, and reviews site that identifies leading IT and marketing service providers and software. Clutch recently announced over 260 B2B companies that embody industry leadership in Los Angeles based on their market presence, expertise, verified client feedback, and past and current clientele. Directive was awarded the leading spot on the advertising and marketing list. “We are thrilled for this opportunity to be recognized as the go-to service provider for B2B marketing,” said CEO and Co-founder Garrett Mehrguth. “This is a testament to our team’s dedication and unwavering focus on excellence and to deliver premier services to our clients.” Since Directive's establishment in 2014, Mehrguth has led its expansion to five global offices: Orange County, California; Austin, Texas; Los Angeles; New York City; and London. Directive has grown at a year-over-year rate of 300 percent and is now celebrating its recognition as the number one B2B marketing and advertising service provider in Los Angeles.

Read More

27-year-old Search Marketing CEO Lands 13-stop U.S. Speaking Tour

Directive Consulting | May 29, 2019

Garrett Mehrguth, CEO and co-founder of the B2B and enterprise search marketing agency Directive, was recently selected to speak at 13 stops of the Digital Summit tour. Digital Summit, the largest conference series in the digital marketing industry, has invited Mehrguth to share his presentation, "5 Data-Validated Tactics to Increase the Experienced Marketer's Qualified Lead Volume (...and 3 Tactics That Are Guaranteed to Fail)," with its audiences across the nation. Mehrguth will continue to discuss how B2B and enterprise marketers can advance their digital "discoverability" and take control of their residency on search engine results pages. This approach has catalyzed Directive's growth of 300 percent year-over-year and is the foundation on which the firm's strategies are built, applied across its portfolio of over 75 clients. "I've had the pleasure of working with Garrett over the past year, as he has proven to be a stand-out speaker in our Digital Summit Series," said Leah Harris, content and product strategist for Digital Summit. "We curate 20 marketing conferences with over 1,000 speakers in total, and Garrett consistently engages the crowd with his expertise and surveys in the top 20 percent of speakers."

Read More


Events