Headless

Headless WordPress for High-Demand, High Availability Sites

A Word From MIDTC CTO Nick DeLorenzo on Headless WordPress Sites:

One of the biggest challenges facing high-traffic WordPress sites is that, unless the plugin stack and caching configuration are well-optimized, they tend to experience continuous performance issues. It's also extremely common for conventional WordPress sites to suffer slow load times in the admin when many concurrent users are in the system generating content. This is something inevitably encountered by media sites where conventional WordPress is the primary system of record for content management.

In these situations, I typically recommend that a client utilize what is known as a "Headless" WordPress configuration. In a headless configuration, the WordPress backend is separated from the frontend that renders the site: all of the database and CPU resources for the backend are distinct from, and unimpacted by, frontend utilization and traffic, and vice versa.
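
To make the separation concrete, here is a minimal sketch (illustrative only, not any particular production setup) of a frontend pulling content over the standard WordPress REST API; example.com stands in for your backend host, and a React, Vue, or Angular app would consume the same data:

```typescript
// Minimal sketch: a headless frontend fetching published posts from the
// WordPress REST API. "example.com" is a placeholder for your backend host.
interface WpPost {
  id: number;
  link: string;
  title: { rendered: string };
}

async function fetchLatestPosts(): Promise<WpPost[]> {
  // The core REST API exposes posts at /wp-json/wp/v2/posts;
  // _fields trims the response to only the fields we render.
  const res = await fetch(
    "https://example.com/wp-json/wp/v2/posts?per_page=5&_fields=id,link,title"
  );
  if (!res.ok) throw new Error(`WP API returned ${res.status}`);
  return (await res.json()) as WpPost[];
}

fetchLatestPosts().then((posts) => {
  for (const post of posts) {
    console.log(`${post.title.rendered} -> ${post.link}`);
  }
});
```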

This has a number of advantages, the most immediate of which is performance. The admin gets the maximum potential performance of its dedicated resources, and the frontend no longer has to render content by querying database resources that backend processes may be consuming. As a result, every aspect of site performance can improve.

Another advantage is scalability: you can scale database resources for the backend without needing to scale the webhead or alter configurations on the frontend, or scale frontend resources for greater bandwidth while conserving database resources. For large organizations, this can result in substantial savings.

Headless also shields the backend and database from the direct attack vectors commonly found in a WordPress instance, both by obscuring them from public view and by making it possible to lock down APIs.
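
One common way to lock the APIs down is an allow-list in front of the backend. The sketch below assumes a Node/Express layer sitting between the public internet and a private WordPress origin; the route list is illustrative:

```typescript
// Sketch: an allow-list proxy in front of a private WordPress backend,
// assuming a Node/Express layer. Only read-only content routes are publicly
// reachable; everything else (users endpoint, wp-admin, xmlrpc) is refused.
import express from "express";

const app = express();
const ALLOWED = [/^\/wp-json\/wp\/v2\/(posts|pages|media)(\/|$)/];

app.use((req, res, next) => {
  if (req.method !== "GET" || !ALLOWED.some((re) => re.test(req.path))) {
    return res.status(403).send("Forbidden");
  }
  next(); // allowed requests continue to the proxy handler below
});

// ...proxy middleware forwarding to the private WP origin would mount here...

app.listen(8080);
```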

Finally, it makes it possible to alter feature sets on the admin without impacting the frontend, or to implement redesigns that leverage the same backend functionality. All of these are ideal for large-scale organizations.

There are several disadvantages, however:

It's extremely important that any design and functional requirements for headless sites be spelled out well ahead of time. Every feature of a headless implementation needs to be meticulously pre-planned to ensure that what you input in the admin is reflected on the frontend. This means you can't simply slap a plugin onto your headless implementation and suddenly add new functionality. As a result, new feature sets commonly require development resources.

Additionally, maintenance of headless sites can be complicated for teams that do not have core competencies in both WordPress and the various JavaScript frameworks (for example React, Vue, or Angular) commonly used to render headless frontends. If the development teams do not understand the principles of, or the potential issues with, headless hosting, it can be challenging to isolate and quickly react to problems.

Finally, it's important that the development teams, users, and executives responsible for running these sites not be siloed, and that they share a very clear understanding of the requirements, capabilities, advantages, and limitations of headless sites.

Headless sites are ideal for large-scale media organizations and address many of the challenges they encounter with web hosting. But it's important that the organizations in question be prepared to operate these sites, particularly if they have a direct hand in their maintenance. MIDTC has expertise handling headless WordPress implementations, and we commonly recommend them for organizations facing these unique challenges, as they can save a substantial amount of money and offer superior performance.


Bot Traffic

Bot Traffic: The Good, The Bad, And The Ugly

"Bot" Traffic on your website is almost inevitable, and it's not all bad. You want search bots crawling your site. Social media bots will scrape content for articles you post online.
But there are a wide variety of bots. Some are good, some are bad, some are "neutral", and it can depend on what services you use. Even good bots can behave badly, for example if they scrape your site too frequently and your server doesn't have the resources to handle the load. Bot traffic needs to be curated, both good and bad, and there are a variety of ways to do this. Almost all of our customers have protections in place to block things like AI bots or bad bots, and it's something we recommend as a matter of course.

So What is a Bot?

A bot, short for “robot,” is a software application designed to perform automated tasks over the internet. These tasks can range from simple, repetitive actions to more complex operations. Bots can be broadly categorized into two types: good bots and bad bots.

Types of Bots

  1. Good Bots
    • Search Engine Bots: These bots, like Google’s web crawlers, index content on websites to improve search engine results. (A sketch for verifying them follows this list.)
    • Chatbots: These bots interact with users, providing customer service or answering queries in real-time.
    • Monitoring Bots: Used for tracking website performance, uptime, and other metrics.
  2. Bad Bots
    • Scraping Bots: These bots extract data from websites without permission, often for competitive analysis or content theft.
    • Spam Bots: They post spam content or send unsolicited messages, often cluttering forums and comment sections.
    • DDoS Bots: Used in Distributed Denial of Service (DDoS) attacks to overwhelm a website’s server, causing it to crash.
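
Because bad bots routinely spoof the user agents of good bots, a user-agent string alone can't be trusted. Google documents a reverse-then-forward DNS check for verifying a real Googlebot; here is a sketch using Node's built-in dns/promises module (the sample IP is illustrative):

```typescript
// Sketch: verifying that a visitor claiming to be Googlebot really is one.
// A spoofed user agent from a random IP will fail this check.
import { reverse, lookup } from "node:dns/promises";

async function isRealGooglebot(ip: string): Promise<boolean> {
  try {
    const [host] = await reverse(ip); // e.g. crawl-66-249-66-1.googlebot.com
    if (!/\.(googlebot|google)\.com$/.test(host)) return false;
    const forward = await lookup(host, { family: 4 }); // forward-confirm
    return forward.address === ip;
  } catch {
    return false; // no PTR record or lookup failure: treat as unverified
  }
}

isRealGooglebot("66.249.66.1").then(console.log);
```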

Here are some common problems associated with bot traffic:

1. Skewed Analytics

Bots can significantly distort website analytics. They can inflate page views, bounce rates, and other metrics, making it difficult to accurately assess human visitor behavior. This can lead to misguided marketing strategies and poor decision-making.

2. Server Overload

High volumes of bot traffic can overwhelm servers, leading to slow load times or even crashes. This not only affects user experience but can also impact search engine rankings, as site speed is a crucial factor in SEO.
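
Rate limiting is the usual first defense against this. Below is a deliberately naive per-IP rate limiter, assuming a Node/Express frontend and illustrative limits; production setups typically push this to a CDN, WAF, or a shared store like Redis rather than in-process memory:

```typescript
// Sketch: a naive in-memory, per-IP rate limiter for a Node/Express app.
import express from "express";

const WINDOW_MS = 60_000;  // 1-minute window (illustrative)
const MAX_REQUESTS = 120;  // per IP per window (illustrative)
const hits = new Map<string, { count: number; windowStart: number }>();

const app = express();

app.use((req, res, next) => {
  const ip = req.ip ?? "unknown";
  const now = Date.now();
  const entry = hits.get(ip);
  if (!entry || now - entry.windowStart > WINDOW_MS) {
    hits.set(ip, { count: 1, windowStart: now }); // start a fresh window
    return next();
  }
  if (++entry.count > MAX_REQUESTS) {
    return res.status(429).send("Too Many Requests");
  }
  next();
});
```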

3. Security Risks

Malicious bots can pose serious security threats. They can be used for activities such as scraping content, launching DDoS attacks, and attempting to exploit vulnerabilities. This can lead to data breaches, loss of sensitive information, and damage to the website’s reputation.

4. Ad Fraud

Bots can generate fake clicks on ads, leading to significant financial losses for advertisers. This type of fraud can also damage relationships with advertisers and reduce the overall effectiveness of ad campaigns.

5. Content Scraping

Bots can scrape content from websites, which can then be republished elsewhere without permission. This not only violates copyright laws but can also harm SEO efforts by creating duplicate content issues.

6. Form Spam

Bots often target forms on websites, submitting fake information that can clutter databases and make it difficult to manage legitimate user data. This can also lead to increased costs for data storage and processing.
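
A cheap and effective countermeasure is a honeypot field: a form input hidden from humans with CSS that bots blindly fill in. Here is a sketch, assuming an Express handler and a hypothetical hidden field named "website":

```typescript
// Sketch: a honeypot check for form spam. The form includes a "website"
// field hidden from humans via CSS; bots that fill every field reveal
// themselves by populating it.
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: false }));

app.post("/contact", (req, res) => {
  if (req.body.website) {
    // Honeypot tripped: pretend success so the bot learns nothing.
    return res.status(200).send("Thanks!");
  }
  // ...store or email the legitimate submission here...
  res.status(200).send("Thanks!");
});
```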

How to Mitigate Bot Traffic

To protect your website from the negative impacts of bot traffic, consider implementing the following measures:

  • Use CAPTCHA: Implementing CAPTCHA can help distinguish between human users and bots.
  • Monitor Traffic Patterns: Regularly analyze traffic patterns to identify and block suspicious activity (see the log-analysis sketch after this list).
  • Employ Bot Management Solutions: Use specialized software to detect and mitigate bot traffic.
  • Update Security Protocols: Ensure your website’s security measures are up-to-date to prevent exploitation by malicious bots.
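
As a small example of monitoring traffic patterns, the sketch below counts requests per IP in an access log. It assumes the common log format where the client IP is the first token on each line; the file path and threshold are illustrative:

```typescript
// Sketch: flagging suspicious IPs from an access log by request volume.
import { readFileSync } from "node:fs";

const counts = new Map<string, number>();
for (const line of readFileSync("access.log", "utf8").split("\n")) {
  const ip = line.split(" ")[0]; // first token = client IP in common format
  if (ip) counts.set(ip, (counts.get(ip) ?? 0) + 1);
}

const suspicious = [...counts.entries()]
  .filter(([, n]) => n > 1000)        // illustrative threshold
  .sort(([, a], [, b]) => b - a);     // busiest offenders first

for (const [ip, n] of suspicious) {
  console.log(`${ip}: ${n} requests - candidate for a block or rate limit`);
}
```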

By understanding and addressing these common issues, you can better protect your website and ensure a smoother experience for your legitimate users.

In a forthcoming article, we'll discuss other methods of mitigating bot traffic.


Site Speed

Why Is Site Speed Important For Search Rankings? A Rundown.

The latest updates to Google's algorithm have increasingly stressed Google's Core Web Vitals and site speed when ranking a site. (You can assess your site's performance using Google's PageSpeed Insights tool.) Sites with otherwise good traffic and content have in some cases been harshly penalized for poor site speed or optimization. Media and e-commerce sites are particularly impacted, given the amount of advertising, scripts, images, and video in use on these sites, all of which degrade site performance unless optimized carefully.

Below is an example of a site optimized by MIDTC for performance. In all such cases, search ranking, and therefore traffic, improved dramatically.

[Figure: Page Speed Optimization results]

Website performance is crucial for Google search rankings for several reasons:

  1. User Experience: Fast-loading websites provide a better user experience. Users are more likely to stay on a site that loads quickly and to navigate through its pages. Google aims to deliver the best possible experience to its users, so it favors websites that load quickly.
  2. Core Web Vitals: Google uses Core Web Vitals as part of its ranking criteria. These metrics measure aspects of web performance such as loading speed, interactivity, and visual stability. Websites that perform well on these metrics are more likely to rank higher. (A measurement sketch follows this list.)
  3. Bounce Rate: Slow websites tend to have higher bounce rates, meaning users leave the site quickly without interacting with it. A high bounce rate can negatively impact your search rankings because it signals to Google that users are not finding the site useful.
  4. Mobile Performance: With the increasing use of mobile devices, Google has placed a significant emphasis on mobile performance. Websites that are optimized for mobile devices and load quickly on them are more likely to rank higher in search results.
  5. SEO Benefits: Faster websites can be crawled more efficiently by search engines, leading to better indexing and potentially higher rankings. This efficiency is particularly important for large websites with many pages.
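
As a starting point for measuring these metrics on your own pages, here is a sketch using the Chrome team's open-source web-vitals npm package; the /vitals collector endpoint is a placeholder you would implement yourself:

```typescript
// Sketch: measuring Core Web Vitals in the browser with the "web-vitals"
// npm package. Each callback fires with the metric (CLS, INP, LCP) for the
// current page view, which you can log or beacon to your own analytics.
import { onCLS, onINP, onLCP } from "web-vitals";

function report(metric: { name: string; value: number }): void {
  // navigator.sendBeacon survives page unload better than fetch would;
  // "/vitals" is a placeholder endpoint for your own collector.
  navigator.sendBeacon("/vitals", JSON.stringify(metric));
}

onCLS(report);
onINP(report);
onLCP(report);
```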

Improving your website’s performance can lead to better search rankings, more traffic, and a better overall user experience.

MIDTC has been successfully helping clients improve their site performance through a mix of audits, optimizations, and careful implementation of technology. Typically, we want to see a customer pass Core Web Vitals as measured by Google PageSpeed Insights, with mobile and desktop speed indices in the high 80s or 90s, before we consider an optimization successful.


Plugins

WordPress Plugins: How Many Is Too Many? - The Argument for a Lightweight Site

One of the most common performance issues we encounter is frontend or admin/backend slowness. More often than not, this isn't caused by resource bottlenecks on the server (although it can be) - caching counts for a lot these days. It's usually caused by a tendency to throw plugins at a problem instead of looking into other solutions or investigating what you already have on hand. And once you've built up a lot of plugins, with a lot of content and processes depending on them, it can be very difficult to reverse course.

Here are some key pitfalls:

  1. Performance Issues: Each plugin adds its own code to your site, which can slow down loading times and overall performance. This is especially problematic if the plugins are poorly coded.
  2. Security Vulnerabilities: More plugins mean more potential entry points for hackers. If a plugin is not regularly updated, it can become a security risk.
  3. Compatibility Problems: Plugins can sometimes conflict with each other or with your WordPress theme, leading to errors and site crashes.
  4. Maintenance Overhead: Managing a large number of plugins can be time-consuming. Each plugin requires updates and monitoring to ensure it continues to function correctly.
  5. Increased Database Requests: Plugins often increase the number of database queries, which can slow down your site, especially if you’re on shared hosting. (A simple benchmarking sketch follows this list.)
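
To put numbers on plugin overhead, even a crude benchmark helps: measure average response time, deactivate a suspect plugin on a staging copy, and measure again. Here is a sketch assuming Node 18+ (built-in fetch); the URL and sample count are illustrative:

```typescript
// Sketch: a crude before/after benchmark for judging plugin impact.
async function averageResponseMs(url: string, samples = 10): Promise<number> {
  let total = 0;
  for (let i = 0; i < samples; i++) {
    const start = performance.now();
    // Cache-busting query string so each request exercises PHP and the
    // active plugin stack rather than a cached copy.
    const res = await fetch(`${url}?nocache=${Date.now()}-${i}`);
    await res.text(); // drain the body so we time the full response
    total += performance.now() - start;
  }
  return total / samples;
}

averageResponseMs("https://example.com/").then((ms) =>
  console.log(`Average response time: ${ms.toFixed(0)} ms`)
);
```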