Builders: Reimagining the way we build with WordPress (https://wpengine.com/builders/)

Enhanced Runtime Logs on WP Engine’s headless platform
https://wpengine.com/builders/enhanced-runtime-logs-on-wp-engines-headless-platform/
Sat, 21 Sep 2024

Diagnosing issues and optimizing performance is critical when building headless WordPress applications. WP Engine’s new Enhanced Runtime Logs are designed to give developers deeper insights and make the debugging process easier.

What Are Runtime Logs?

Runtime logs capture real-time information about your application’s activity, errors, and custom log messages. They provide visibility into how your app operates at any given moment, allowing developers to track performance, monitor for errors, and understand user behavior.
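
For instance, anything your server-side code writes to the console is captured as a custom log message. Here is a minimal sketch, assuming a Next.js API route in the Pages Router (the route name and messages are hypothetical):

// pages/api/checkout.js: custom log messages like these surface in the runtime logs.
export default async function handler(req, res) {
  console.log('Checkout started', { method: req.method, path: req.url });

  try {
    // ...your application logic...
    res.status(200).json({ ok: true });
  } catch (err) {
    // Error-level entries are what you would filter for when debugging production issues.
    console.error('Checkout failed:', err.message);
    res.status(500).json({ ok: false });
  }
}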

How Runtime Logs Aid Debugging

Logs are essential for pinpointing issues, especially when things don’t go as expected in production. With real-time visibility into the app’s operations, you can:

  • Detect Errors: Get immediate feedback on what went wrong.
  • Monitor Performance: Keep an eye on memory usage, API calls, and other performance-related metrics.
  • Custom Logging: Add custom log messages to gain insights specific to your application’s unique workflows.

Benefits of WP Engine’s Enhanced Runtime Logs

WP Engine’s enhancements to runtime logs improve the developer experience significantly:

  • Visual Log Activity Chart: Color-coded entries and deployment markers offer an intuitive way to understand log data. Developers can quickly correlate logs with code changes.

  • Advanced Filtering: Quickly filter error or info logs, and use full-text search to find specific issues or patterns within the logs.
  • Time-Based Filtering: Choose predefined or custom time windows to investigate issues and track performance. You can also view logs since the last build.
  • Improved Diagnostics: These features enable faster diagnosis, better optimization, and more effective debugging in production environments.

How to Access the Logs

Accessing these logs is easy:

  1. Navigate to the Logs tab in your WP Engine headless platform account.
  2. Select the time period you want to inspect under the Runtime section.

WP Engine’s runtime logs retain data for the past 24 hours, giving you up-to-date information to work with.

Why This Matters for Developers

These enhancements are all about efficiency. When debugging headless WordPress applications, the quicker you can identify and address issues, the better. WP Engine’s runtime logs allow developers to optimize performance, reduce downtime, and deploy with confidence.

With these tools, you’ll be better equipped to maintain smooth, reliable applications. WP Engine’s documentation provides more information on how to use these features.

As always, we’re stoked to hear your feedback and see what headless projects you’re building! Hit us up in our Discord!

WP Engine’s Node.js Edge Cache Purge Library for the headless WordPress platform
https://wpengine.com/builders/wp-engines-node-js-edge-cache-purge-library-for-the-headless-wordpress-platform/
Tue, 03 Sep 2024

Efficient caching is essential for maintaining performance in modern web applications, but keeping cached content up-to-date can be challenging.  In this article, I will discuss how WP Engine’s Edge Cache Purge Library for Node.js addresses this by allowing targeted cache purging through specific paths or tags, rather than clearing the entire cache.

If you prefer the video format of this article, please access it here:

Prerequisites

Before reading this article, you should have the following prerequisites checked off:

  • Foundational knowledge of JavaScript frameworks
  • A WP Engine headless WordPress platform account
  • Node.js

If you’re looking for a headless platform to develop on, you can get started with a WP Engine Headless WordPress sandbox site for free:

Understanding Edge Cache and Its Role in Headless WordPress

To understand the benefit of this WP Engine feature, let’s first define what edge caching is.

Edge caching refers to storing cached content at edge locations, which are servers distributed globally closer to users. This reduces latency by delivering content from the nearest server rather than the origin server, improving load times and overall user experience.

How It Differs from Other Caching Layers

You might ask yourself how it differs from other caching layers. In a headless WordPress setup using a JavaScript frontend framework and WPGraphQL, multiple caching layers exist:

  • Application-Level Caching: This involves caching GraphQL queries or API responses within the application. For example, in Next.js’s server-side rendering or incremental static regeneration. It focuses on reducing the need to repeatedly fetch data from the WordPress backend.
  • CDN Caching: Content Delivery Networks (CDNs) cache static assets like images, CSS, and JavaScript files at edge locations. This is similar to edge caching but focused on static resources.
  • Database Caching: WordPress can use database-level caching with object caches like Redis to speed up database queries and reduce server load.

Edge Cache vs. Other Caches

The main differences between edge cache versus other caching layers are as follows:

Edge Cache: Specifically stores whole pages or HTML responses at edge locations. It can be dynamically purged by paths or tags using tools like WP Engine’s Edge Cache Purge Library. This makes it highly efficient for frequently changing content, allowing rapid updates without waiting for other cache layers to expire.

Application and Database Caches: These are closer to the backend and primarily reduce server load by avoiding redundant data processing.

Benefits of Edge Cache in Headless WordPress

Here are the main benefits of using WP Engine’s edge cache library:

  • Performance: Delivering cached content from locations near users significantly reduces latency.
  • Scalability: It handles high traffic efficiently without burdening the origin server.
  • Dynamic Purging: Allows for granular control over what gets purged and updated, ensuring content remains fresh without unnecessary full cache clears (see the sketch after this list).
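
As a quick preview of that granular control, here is a minimal sketch using the library’s purgePaths function (the slug and the wrapper function name are placeholders; the full setup follows below):

// Invalidate the edge cache for a single path rather than clearing everything.
import { purgePaths } from '@wpengine/edge-cache';

export async function purgeSinglePost(slug) {
  // Only cached documents for this path are purged; other pages stay cached.
  await purgePaths(`/blog/${slug}`);
}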

Setup and Installation With SvelteKit

In this example, let’s use SvelteKit, a frontend framework similar to Nuxt and Next (though any framework you choose will work).

1. First, install and pull down the SvelteKit framework with npm create:
npm create svelte@latest

2. Once you have a SvelteKit app spun up, navigate into the directory of your project and install the edge cache library:

npm install @wpengine/edge-cache 

3. Now, ensure you have the necessary environment variables configured, such as authentication credentials and your WPGraphQL endpoint/WordPress endpoint.

4. In this example, let’s purge by path. Navigate to the src/routes folder and create a folder called blog/[uri]. Inside the [uri] folder, create a +page.svelte file. You can drop in this code block, or the equivalent for whichever framework you decide to use:

<script>
  import { onMount } from 'svelte';
  
  let posts = [];
  let loading = true;

  async function fetchGraphQL(query) {
    const queryParams = new URLSearchParams({ query }).toString();
    const response = await fetch(`${import.meta.env.VITE_GRAPHQL_ENDPOINT}?${queryParams}`, {
      method: 'GET',
      headers: {
        'Accept': 'application/json',
      },
    });

    if (!response.ok) {
      throw new Error(`Network error: ${response.statusText}`);
    }
    return response.json();
  }

  const GET_POSTS_QUERY = `
    query GetPosts {
      posts(first: 5) {
        nodes {
          id
          title
          slug
          uri
          date
        }
      }
    }
  `;

  onMount(async () => {
    try {
      const { data, errors } = await fetchGraphQL(GET_POSTS_QUERY);
      if (errors) {
        console.error('Errors returned from server:', errors);
      } else {
        posts = data.posts.nodes;
      }
    } catch (error) {
      console.error('An error occurred:', error);
    } finally {
      loading = false;
    }
  });
</script>

{#if loading}
  <div class="flex justify-center items-center min-h-screen">
    <p>Loading...</p>
  </div>
{:else}
  <div class="bg-custom-dark min-h-screen text-hot-pink p-8">
    <h1 class="text-4xl mb-8 font-bold text-center">Fran's Headless WP with SvelteKit Blog Example</h1>
    <ul class="space-y-4">
      {#each posts as post}
        <li class="text-center">
          <a href={`/blog${post.uri}`} class="text-2xl hover:text-pink-600">{post.title}</a>
          {#if post.date}
            <p class="text-gray-500 text-sm">{new Date(post.date).toLocaleDateString()}</p>
          {:else}
            <p class="text-gray-500 text-sm">No date available</p>
          {/if}
        </li>
      {/each}
    </ul>
  </div>
{/if}

This code block fetches a list of recent posts from WPGraphQL and links each one to its detail route at /blog/[uri].

Just to be clear, this is meant to focus on the framework agnosticism of the headless WordPress platform and this edge cache library; it is not meant to be a SvelteKit tutorial. That said, you should be able to follow along using whatever dynamic route file you have in whatever framework you choose.

5. Now that we have created the path we want to purge, let’s create the API route that will execute the purge logic. Navigate to src/routes and create an api/purge folder containing a +server.js file. Add this code block:

// src/routes/api/purge/+server.js
import { json } from '@sveltejs/kit';
import { purgePaths } from '@wpengine/edge-cache';

export async function GET({ url }) {
    const uri = url.searchParams.get('uri'); // Extract the 'uri' parameter from the query string

    try {
        if (uri) {
            // Purge the specific blog path
            await purgePaths(`/blog/${uri}`);
            return json(
                { success: true, message: `Cache purged for /blog/${uri}` },
                { status: 200 }
            );
        } else {
            return json(
                { success: false, message: 'URI is required' },
                { status: 400 }
            );
        }
    } catch (error) {
        return json(
            { success: false, message: error.message },
            { status: 500 }
        );
    }
}

At the top of this file, we import the purgePaths function from the WP Engine Edge Cache library, which is used to purge the cache for specific paths.

Following that, we define an async function that handles GET requests to that endpoint, along with a const that extracts the uri param from the request URL’s query string.

If the uri is provided, it calls the purgePaths function with the full path, then returns a success response with status 200.

If uri is not present, it returns a 400 status with an error message indicating that the URI is required.

Following that, we catch any errors during the purge process and return a 500 status with the error message, indicating an internal server error occurred during the operation.

This version allows cache purging through a GET request by passing the uri parameter directly in the query string.

You would have a URL that looks something like this when you pass the URI parameter directly into the query string:

https://your-wpengine-site/api/purge?uri=some-blog-post

Once you have this endpoint, you can set it up in a webhook, for example, to automate the purge. This ensures that when a user visits a blog detail page, you can selectively purge the cache for that specific path, keeping your content up-to-date efficiently.
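
For example, a webhook handler or deploy script could call the endpoint like this. This is a minimal sketch where the domain and slug are placeholders:

// Trigger the purge endpoint created above for a single blog post.
async function purgeBlogPost(slug) {
  const res = await fetch(
    `https://your-frontend-site.com/api/purge?uri=${encodeURIComponent(slug)}`
  );

  if (!res.ok) {
    throw new Error(`Purge failed with status ${res.status}`);
  }

  return res.json();
}

purgeBlogPost('hello-world')
  .then((result) => console.log(result.message))
  .catch((err) => console.error(err));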

Conclusion

The WP Engine Edge Cache Purge Library allows you to manage edge cache dynamically and keep your content up-to-date with targeted purges. This setup offers a flexible, framework-agnostic solution that fits well with any Node.js-based application.

We hope you have a better grasp of the edge cache library. As always, we’re stoked to hear your feedback and see what you are building in headless WordPress! Hit us up in our Discord!

How to Customize WPGraphQL Cache Keys 💾🔑
https://wpengine.com/builders/how-to-customize-wpgraphql-cache-keys/
Tue, 27 Aug 2024

Caching is important in optimizing performance for headless WordPress setups. The WPGraphQL Smart Cache plugin helps manage caching for GraphQL queries, ensuring faster response times. In this guide, we’ll walk you through setting up your WordPress environment, installing the necessary plugins, and customizing GraphQL cache keys to better suit your specific needs.

Prerequisites

Before reading this article, you should have the following prerequisites checked off:

If you’re looking for a headless platform to develop on, you can get started with a WP Engine Headless WordPress sandbox site for free:

Understanding Default WPGraphQL Smart Cache Behavior

WPGraphQL Smart Cache automatically tags cached responses with keys derived from the GraphQL queries. These keys are linked to specific WordPress data (e.g., posts, pages, taxonomies). When relevant data is updated, the associated cache is invalidated. 

For example, a query that retrieves posts with specific categories and tags will generate cache keys like list:post, list:category, and list:tag. If any of these categories or tags are updated, the entire cache is invalidated, ensuring the data stays current.  

In addition to the list:$type_name keys, individual node IDs are also included. 

These individual IDs are used to purge cache when updates or deletes happen.

The list:$type_name key is used to purge when a new item of that type is published. For example, list:post will be purged when a new post is published, whereas purge( "post:1" ) would be triggered when post 1 is updated or deleted.

Let’s see this in action.  Navigate to your WP admin, then open your WPGraphQL IDE. Copy and paste this query:

query GetPosts {
  posts {
    nodes {
      title
      uri
    }
  }
}

When you press play in your IDE, this will make a query to your site’s WPGraphQL endpoint. 

WPGraphQL will then return headers that caching clients can use to tag the cached document. Next, open your dev tools (in this case, I am using Google Chrome). When you inspect the response headers, you should see this:

Here, we see the X-GraphQL-Keys header with a list of keys. If this query were made via a GET request to a site hosted on a host that supports it, the response would be cached and “tagged” with these keys.
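
If you would rather check it from code than from dev tools, a minimal Node.js sketch like the following logs the header. The endpoint URL is a placeholder, and this assumes your WPGraphQL endpoint accepts GET requests:

// Inspect the X-GraphQL-Keys header that WPGraphQL returns for a GET query.
const query = `query GetPosts { posts { nodes { title uri } } }`;
const endpoint = `https://your-wordpress-site.com/graphql?query=${encodeURIComponent(query)}`;

fetch(endpoint)
  .then((response) => {
    console.log('X-GraphQL-Keys:', response.headers.get('x-graphql-keys'));
    return response.json();
  })
  .then(({ data }) => console.log(data.posts.nodes))
  .catch((err) => console.error(err));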

For this particular query, we see the following keys:

  • Hash of the Query: A unique identifier for the specific query made, which in this example is 4382426a7bebd62479da59a06d90ceb12e02967d342afb7518632e26b95acc6f. It ensures that the exact same query returns the same cached response unless invalidated.

  • Operation Type (graphql:Query): Indicates that the operation is a GraphQL query, as opposed to a mutation or subscription.
  • Operation Name (operation:GetPosts): Identifies the specifically named query, in this case, GetPosts, which helps in targeting this operation for caching or invalidation.
  • List Key (list:post): This key identifies that the query is fetching a list of posts. Any changes to the list of posts would trigger cache invalidation.
  • Node ID (cG9zdDox): This represents the specific node (e.g., a post) that was resolved in the query. Changes to this node will invalidate the cache for this query.

The cached document is tagged with these keys; if a purge event for one of those tags is triggered, the document is purged (deleted) from the cache.

Understanding Cache Invalidation with WPGraphQL Smart Cache

WPGraphQL Smart Cache optimizes caching by sending the keys in the headers, but the caching client (e.g., Varnish or LiteSpeed) needs to use those keys to tag the cache. WPGraphQL Smart Cache itself does not tag the cached document; it provides the caching client with the info (the keys) to tag the cached document. A supported host like WP Engine works with WPGraphQL Smart Cache out of the box.

Let’s discuss how invalidation works:

WPGraphQL Smart Cache listens to various events in WordPress, such as publishing, updating, or deleting content, and triggers cache invalidation (or “purge”) based on these events.

Detailed Key Breakdown:

  • Publish Events (purge('list:$type_name')): When a new post or content type is published, the cache for the entire list associated with that content type (e.g., all posts) is purged. This ensures that any queries fetching this list will be up-to-date.
  • Update Events (purge('$nodeId')): When an existing post or content type is updated, the cache for that specific node (e.g., a single post) is purged. This allows the updated content to be fetched without affecting the entire list.
  • Delete Events (purge('$nodeId')): Similarly, when a post or content type is deleted, the cache for that specific node is purged, ensuring that the deleted content is no longer served from the cache.

Why This Matters:

These targeted cache invalidations help maintain the balance between performance and data freshness. By only purging the cache when necessary and only for the relevant data, WPGraphQL Smart Cache ensures that users receive up-to-date content without unnecessary cache purges, which can negatively impact performance.

This invalidation strategy is crucial for optimizing the performance of headless WordPress setups using WPGraphQL, especially in dynamic environments where content changes frequently.

How Cache Invalidation and Cache Tags Work Together

Now that we’ve explored how cached documents are tagged and how cache invalidation works in WPGraphQL Smart Cache, let’s see how these concepts interact.

When a GraphQL query is executed, specific cache keys (tags) are associated with the cached response. These tags correspond to the data queried, such as posts, categories, or specific node IDs. The cache invalidation strategy then ensures that when relevant data changes occur in WordPress, the associated cached documents are purged based on these tags.

Example: Invalidation Scenarios for a GetPosts Query

  1. Publishing a New Post (purge('list:post')):
    • When a new post is published, the entire list of posts in the cache (tagged with list:post) is invalidated. This ensures that the new post will appear in any subsequent queries that fetch this list.
  2. Updating or Deleting a Specific Node (purge('$nodeId')):
    • If the “Hello World” post (with the ID cG9zdDox) is updated or deleted, the cache for that specific node is purged. This allows the updated or deleted content to be accurately reflected in any future queries.
  3. Manually Purging Cache (purge('graphql:Query')):
    • Clicking “Purge Cache” in GraphQL > Settings > Cache page triggers a manual cache purge for all queries. This can be useful when you want to ensure that all cached data is refreshed, regardless of specific events.
  4. Operation Name or Query Hash-Based Purge:
    • Custom purge events can be manually triggered based on the operation name (e.g., GetPosts) or the hash of the query. This level of control allows you to finely tune when and how caches are invalidated.

These strategies work together to ensure that the cache is only invalidated when necessary, providing up-to-date data without unnecessary performance overhead. For instance, when the “Hello World” post is updated, it’s reasonable to expect that the cache for the GetPosts query should be purged so that any queries return the most current data. This fine-grained control over cache invalidation ensures that your headless WordPress site remains performant while delivering fresh content.

Why Would You Need to Customize WPGraphQL Cache Keys?

In some scenarios, the default caching behavior might be too broad, leading to frequent cache invalidations.  This is especially true for more complex queries. 

For instance, if your query includes categories and tags, any update to these taxonomies will invalidate the cache, even if those changes don’t affect the specific posts you’re querying. Customizing cache keys allows you to fine-tune this behavior, ensuring that only relevant updates trigger cache invalidation, thereby improving performance.

For example, consider the following query:

{
  posts {
    nodes {
      id
      title
      tags {
        nodes {
          id
          name
        }
      }
    }
  }
  categories {
    nodes {
      id
      name
    }
  }
  tags {
    nodes {
      id
      name
    }
  }
}

This query retrieves a list of posts, along with all categories and tags. When this query is executed, the response includes the posts, categories, and tags that match the query as shown here:

The X-GraphQL-Keys header shows that the cached document is tagged with list:post, list:category, and list:tag. This tagging means that the cache will be invalidated whenever there’s a change in any of these entities—whether it’s a new post, category, or tag.

While this behavior ensures that your cache is up-to-date, it can lead to excessive cache invalidation. For instance, if a new tag is created and assigned to a post not included in this query, it will still trigger a purge('list:tag'), invalidating the cache for this query.

This means the cache could be cleared more often than you want for your specific use case, which could negatively impact performance.

Just A Note

Consider this query from the original article on this subject:

query GetPostsWithCategoriesAndTags {
  posts {
    nodes {
      id
      title
      categories {
        nodes {
          id
          name
        }
      }
      tags {
        nodes {
          id
          name
        }
      }
    }
  }
}

The WPGraphQL team has since changed things to only track list:$type_name keys for types queried at the root. So, if you run this query, your nested lists of categories and tags won’t be tracked because they are not at the root.

The Problem

The problem is that the list:category and list:tag keys could cause this document to be purged more frequently than you might like. WPGraphQL tracks precisely, but it doesn’t know your specific intention and what you care about.

For example, you might simply not care if this particular query is “fresh” when terms change. OR you might ONLY care for this query to be fresh when terms change. 

WPGraphQL doesn’t know the intent of the query, only what the query is.

Fortunately, you can customize the cache keys to better suit your specific needs, reducing unnecessary cache invalidations and improving performance.

Customizing Cache Keys

By customizing the cache keys, you can ensure that the cache is only invalidated when changes you believe are relevant to your use case occur. This involves fine-tuning the tags associated with your queries, allowing you to maintain optimal performance without sacrificing data accuracy.

Let’s do this by navigating to our WP admin and modifying the functions.php file. Go to Appearance > Theme File Editor and select the functions.php file from your active theme.

Insert this code snippet at the bottom of your functions.php file to customize the cache keys for a specific GraphQL operation. In this case, let’s add an operation name to the query we used in the section before. We are calling our operation GetPostsWithCategoriesAndTags:

add_filter( 'graphql_query_analyzer_graphql_keys', function( $graphql_keys, $return_keys ) {
    // Split the space-separated keys string into an array.
    $keys_array = explode( ' ', $return_keys );

    // Only modify keys for the GetPostsWithCategoriesAndTags operation.
    if ( ! in_array( 'operation:GetPostsWithCategoriesAndTags', $keys_array, true ) ) {
        return $graphql_keys;
    }

    // Remove the taxonomy list keys so term changes no longer purge this document.
    $keys_array = array_diff( $keys_array, [ 'list:tag', 'list:category' ] );
    $graphql_keys['keys'] = implode( ' ', $keys_array );

    return $graphql_keys;
}, 10, 2 );

You should have something that looks like this:

This snippet customizes the cache keys for the GetPostsWithCategoriesAndTags operation. It removes the list:tag and list:category keys from the key list, preventing tag and category updates from invalidating the cache for this specific query. The array_diff() function is used to filter out the unwanted keys, and the modified keys are then reassembled into a string and returned.

Let’s test this now in WPGraphQL IDE and the browser dev tools:

Stoked! Now, as you see in the dev tools image, publishing new categories and tags, which triggers purge( 'list:category' ) and purge( 'list:tag' ), will not purge this document.

We’re getting the benefits of cached GraphQL documents. The document is invalidated when the post is updated or deleted, but we’re letting the document remain cached when categories or tags are created.

Conclusion

We hope you have a better understanding of using filters, as demonstrated above, to customize your cache tagging and invalidation strategies to better suit your project’s specific needs. By taking control of how cache keys are managed, you can optimize performance and reduce unnecessary cache invalidations.
As always, we look forward to hearing your feedback, thoughts, and projects so hit us up in our headless Discord!

On-Demand ISR Support for Next.js/Faust.js on WP Engine’s headless WordPress Platform
https://wpengine.com/builders/on-demand-isr-support-for-next-js-faust-js-on-wp-engines-headless-wordpress-platform/
Mon, 12 Aug 2024

WP Engine’s headless WordPress hosting platform is the go-to, end-to-end solution for the headless approach. In this article, I will guide you through the easy implementation of the latest feature on the platform: support for On-Demand ISR with Next.js/Faust.js. By the end of this article, you will have a better understanding of On-Demand ISR and how to use the headless WP platform to support it with Next.js/Faust.js.

If you prefer the video format of this article, you can access it here:

Prerequisites

Before reading this article, you should have the following prerequisites checked off:

  • Basic knowledge of Next.js and Faust.js.
  • A WP Engine headless WordPress account and environment set up.
  • Node.js and npm installed on your local machine.

If you need to build a basic understanding of Next.js and Faust.js first, please visit the docs:

https://nextjs.org/docs

https://faustjs.org/tutorial/get-started-with-faust

What is On-Demand ISR?

On-Demand Incremental Static Regeneration (ISR) is a feature that allows you to manually purge the Next.js cache for specific pages, enabling more dynamic and timely updates to your site. Typically, in regular ISR when you set a revalidate time, such as 60 seconds, all visitors will see the same generated version of your site for that duration. The cache is only invalidated when someone visits the page after the revalidation period has passed.
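
For context, here is a minimal sketch of that time-based behavior in the Next.js Pages Router, using the 60-second window mentioned above (the endpoint URL is a placeholder):

// pages/index.js: time-based ISR regenerates this page at most once every 60 seconds,
// and only after someone visits it once that window has passed.
const query = encodeURIComponent('{ posts { nodes { title uri } } }');

export async function getStaticProps() {
  // Placeholder WPGraphQL endpoint; point this at your own site.
  const res = await fetch(`https://your-wordpress-site.com/graphql?query=${query}`);
  const { data } = await res.json();

  return {
    props: { posts: data.posts.nodes },
    revalidate: 60, // seconds
  };
}

export default function Home({ posts }) {
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.uri}>{post.title}</li>
      ))}
    </ul>
  );
}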

With the introduction of On-Demand ISR in Next.js version 12.2.0, you can now trigger cache invalidation manually, providing greater flexibility in updating your site. This is particularly useful when:

  • Content from your headless CMS is created or updated.
  • E-commerce metadata changes, such as price, description, category, or reviews, are made.

This feature streamlines the process of reflecting changes on your site in real-time, ensuring that your content is always fresh and up-to-date.

Why Use it in headless WordPress?

In headless WordPress, the front end is decoupled from the WordPress backend, often using Next.js and Faust.js to render the website. This architecture offers several advantages, such as potential for improved performance, enhanced security, and greater flexibility in choosing front-end technologies.

However, one challenge with headless WordPress is ensuring that content changes in WordPress are reflected on the front end without sacrificing performance. This is where On-Demand ISR becomes crucial. By leveraging On-Demand ISR, you can achieve the following benefits:

Up-to-date Content: On-Demand ISR allows your site to fetch the latest content updates from WordPress manually, as needed. Unlike regular ISR, which checks for updates at specified intervals, On-Demand ISR lets you trigger cache invalidation whenever content is created or updated in WordPress. This ensures that users see the most recent content without waiting for a revalidation period.

Enhanced Performance: Since On-Demand ISR updates only the specific pages that need regeneration at the moment they are triggered, your site remains fast and responsive. Initial load times are minimized, and only the changed content is updated, reducing server load and build times.

SEO Benefits: Static pages are highly favored by search engines due to their speed and reliability. With On-Demand ISR, you maintain the SEO advantages of static generation while ensuring that your content is always fresh and relevant, as updates are reflected immediately after they are triggered.

Scalability: On-Demand ISR enables your site to handle large volumes of content efficiently. Whether you’re running a blog with frequent updates or an e-commerce site with dynamic product listings, On-Demand ISR ensures that your site scales seamlessly.

All those benefits got me stoked! Let’s get it on our Next.js and Faust.js sites!

Configuring Next.js with the headless WP Platform for On-Demand ISR

Let’s configure our Next.js application to work with On-Demand ISR.

Here is the docs link to the headless WordPress platform support for On-Demand ISR.

Atlas-Next Package

In your Next.js project, go to your terminal and install the @wpengine/atlas-next package:

npm install --save @wpengine/atlas-next

This package provides improved support on the headless WP platform. Once you install it, ensure it is in your project by navigating to your package.json file at the root of your project:

{
  "name": "atlas-on-demand-isr",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "start": "next start",
    "lint": "next lint"
  },
  "dependencies": {
    "@wpengine/atlas-next": "^1.3.0-beta.0",
    "next": "14.2.4",
    "react": "^18",
    "react-dom": "^18"
  },
  "devDependencies": {
    "eslint": "^8",
    "eslint-config-next": "14.2.4",
    "postcss": "^8",
    "tailwindcss": "^3.4.1"
  }
}

Now that you have verified the proper installation, stay at the root of your project and modify your next.config.js file like so:

const { withAtlasConfig } = require("@wpengine/atlas-next");

/** @type {import('next').NextConfig} */
const nextConfig = {
  // Your existing Next.js config
};

module.exports = withAtlasConfig(nextConfig);

Faust.js Wrapper and Next.js/React.js Versions

If you are using Faust.js, please note that you need to update Next.js to a minimum of 13.5.1 and React to 18.3.1 for this feature to work in Faust. The npx command that pulls down the Faust.js boilerplate from the Faust docs ships with Next.js 12 by default, so be sure to update these versions if you are using that starter.

Following the update to your versions, all you need to do is modify your next.config.js file using the withFaust wrapper:

const { withFaust } = require("@faustwp/core")
const { withAtlasConfig } = require("@wpengine/atlas-next")

/** @type {import('next').NextConfig} */
const nextConfig = {
  // Your existing Next.js config
}

module.exports = withFaust(withAtlasConfig(nextConfig))

Next, we need to verify that it works.  Run your app in dev mode via npm run dev and you should see this output in your terminal:

Stoked! It works!

Create an API route

The first thing we need to do is create an API route.  This will allow you to pass the path to be revalidated as a parameter.

Step 1. Create the API route file: Navigate to the pages/api directory in your Next.js project and create a new file named revalidate.js.

Step 2. Add the API Route code: Open revalidate.js and add the following code:

export default async function handler(req, res) {
  // Check for a secret token to authenticate the request
  if (req.query.secret !== process.env.REVALIDATION_SECRET) {
    return res.status(401).json({ message: 'Invalid token' });
  }

  const path = req.query.path;

  // Ensure the path parameter is provided
  if (!path) {
    return res.status(400).json({ message: 'Path query parameter is required' });
  }

  try {
    // Revalidate the specified path
    await res.revalidate(path);
    return res.json({ revalidated: true });
  } catch (err) {
    return res.status(500).json({ message: 'Error revalidating', error: err.message });
  }
}

Step 3.  Configure the environment variables: Create a .env.local file in the root of your project if it does not exist already.

Step 4. Next, create a secret token. This code sets up an API route that checks for a secret token for security, validates the presence of the path parameter, and triggers the revalidation of the specified path.

Once you have Node.js installed, you can use it to generate a secret token.

Open Your Terminal: Start by opening your terminal or command prompt.

Generate a Secret Token: Run the following command in your terminal:

node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"

This command uses the crypto module to generate a 32-byte random string in hexadecimal format. The output will be your secret token.

Once the token is generated, copy and paste it into your .env.local file and give it an environment variable name e.g. `REVALIDATION_SECRET` 

It should look like this: REVALIDATION_SECRET=your-secret-key

Use the API route to configure cache revalidation

To configure cache revalidation in the headless WordPress setup, you could follow one of the approaches below:

Use a webhook plugin: You can use plugins like WP Webhooks to enable webhook functionality and trigger the API endpoint you’ve just created when relevant events occur, such as when a post is published or updated.

Once you have generated your secret key, append it to your API endpoint as a query string parameter, where the key is secret and the value is the secret token, along with the path you want revalidated. For instance:

https://your-nextjs-site.com/api/revalidate?secret=your-secret-key&path=/your-route

This is the endpoint you can embed in the relevant field when using a plugin like WP Webhooks. The field correlates to whatever action should trigger the endpoint associated with this path, such as updating a post.

Just a note: if you are developing locally and want to test this, you will have to hit the endpoint manually against your local dev server (which typically runs on port 3000), since only your local app knows about the paths to revalidate on your machine.
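
For instance, a small Node.js script like this one could hit the local endpoint during development. This is a sketch where the path is a placeholder, and REVALIDATION_SECRET is the variable from your .env.local (it must be available in your shell, as it is not read from .env.local automatically):

// Manually trigger revalidation against the local dev server (usually port 3000).
const secret = process.env.REVALIDATION_SECRET;
const path = '/hello-world'; // placeholder path to revalidate

const url = `http://localhost:3000/api/revalidate?secret=${secret}&path=${encodeURIComponent(path)}`;

fetch(url)
  .then((res) => res.json())
  .then((data) => console.log(data))
  .catch((err) => console.error('Revalidation request failed:', err));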

Set up WordPress hooks: You can also add actions in your WordPress theme or plugin to send requests to your Next.js API route. Here’s an example using the wp_remote_post function in PHP, which sends a POST request to the Next.js API route whenever a post is saved in WordPress, triggering revalidation of the corresponding path:

function trigger_revalidation($post_id) {
  // Convert the permalink to a relative path (e.g. /hello-world/), since the revalidate route expects a path.
  $path = wp_make_link_relative( get_permalink( $post_id ) );
  $url  = 'https://your-nextjs-site.com/api/revalidate?secret=your-secret-token&path=' . rawurlencode( $path );

  $response = wp_remote_post( $url );

  if ( is_wp_error( $response ) ) {
    error_log( 'Error triggering revalidation: ' . $response->get_error_message() );
  }
}
add_action( 'save_post', 'trigger_revalidation' );

Headless WordPress Platform User Portal

We now have On-Demand ISR set up with the proper configuration. The last steps are to connect our remote repository to WP Engine’s headless WP platform, git push any changes, and observe On-Demand ISR working in all its cache invalidation glory.

If you have not connected your local project to a remote repository, go ahead and do so. The WP Engine headless platform supports GitHub, Bitbucket, and GitLab.

Once you have connected your remote repository and added all your necessary environment variables, go ahead and build the app. If you have done this with an existing repo, you can git push the change, which will trigger a build.

Once the application finishes the build step, you will land on the main page of the WP Engine headless WP portal. Navigate over to the Logs subpage:

In the Logs subpage, click the “Show logs” button on the Runtime option:

You should see the same output as you did in your terminal, confirming it’s working properly:

Awesome! It is implemented and working at runtime. Now, when you edit or add new content in your WP backend and then visit the live URL of the WP Engine headless WP site you just deployed, ISR should work on demand like so:

Limitations

At the moment, the headless WP platform supports On-Demand ISR with the following limitations:

  1. Requires @wpengine/atlas-next package: To enable On-Demand ISR on Atlas, you must use the @wpengine/atlas-next package and follow the setup steps outlined in the previous sections of this document.
  2. On-demand ISR for App Router is not supported: The App Router is a new routing system in Next.js that enhances routing capabilities with features like server components and nested layouts. However, Atlas currently supports On-Demand ISR only in the context of the traditional Pages Router. This means that methods like revalidatePath and revalidateTag, which are used for revalidation in the App Router, are not compatible with the headless WP platform’s ISR mechanism. For more details on the App Router and its data fetching methods, you can refer to the Next.js documentation here.
  3. Rewrites are not supported: Rewrites in Next.js allow you to change the destination URL of a request without altering the URL visible to the user. However, On-Demand ISR on the headless WP platform does not support rewrites. This means that if your Next.js application relies on rewrites, the On-Demand ISR feature might not function as expected. You can learn more about rewrites here.
  4. Not compatible with Next.js I18N: Since Next.js uses rewrites for internationalization, this feature is not supported on the headless WP platform due to the rewrite limitation mentioned above.
  5. Next.js >=13.5 is required: To be able to use this feature, you need to update your application to Next.js version 13.5 or higher.

Note: Redirects (which, unlike rewrites, actually change the URL in the user’s browser to the specified destination) are supported.

If you want to give us feedback on how we can make things better for this feature or anything else with the platform, please visit our feedback form.

Conclusion

Implementing On-Demand Incremental Static Regeneration (ISR) with Next.js and Faust.js on WP Engine’s headless WP platform is a game-changer for maintaining performance and up-to-date content in a headless WordPress setup. By following the steps outlined in this guide, you can leverage On-Demand ISR to ensure your site remains both fast and current, without the need for full rebuilds.

The integration with the platform also simplifies the deployment and management process, providing a seamless workflow from development to production. 

As always, we look forward to hearing your feedback, thoughts, and projects so hit us up in our headless Discord!

Contributing to Open Source Projects
https://wpengine.com/builders/contributing-to-open-source-projects/
Wed, 24 Jul 2024

This article aims to guide developers on best practices for contributing to Faust.js and other Open Source Software (OSS) and provide actionable steps for getting started with contributions. Whether you’re a seasoned developer or just starting, this guide will help you navigate the OSS ecosystem and make meaningful contributions.

What Defines an OSS?

Open Source Software (OSS) is software released with a license that allows anyone to view, modify, and distribute the source code. This openness fosters a collaborative environment where developers can contribute, innovate, and improve the software. OSS is the backbone of many critical technologies and has a profound impact on the software industry by promoting transparency, security, and community-driven development.

Understanding the Open Source Ecosystem

In order to understand the OSS ecosystem, let’s define the types of contributions to it.  There are four main types of contributions:

  1. Code Contributions: Adding new features, fixing bugs, and improving software performance and security are common ways to contribute code. These contributions directly impact the project’s functionality and reliability.
  2. Documentation Improvements: Clear and comprehensive documentation is crucial for any OSS project. Contributing to user guides, API references, and tutorials helps new users and contributors understand and utilize the software effectively.
  3. Community Support: Helping other users in forums, social media, and chat channels builds a supportive and inclusive community. Providing answers, sharing knowledge, and offering guidance are valuable contributions.
  4. Testing and Bug Reporting: Identifying and reporting bugs, performing quality assurance, and testing new features ensure the software remains robust and reliable. Thorough testing and detailed bug reports help maintainers address issues efficiently.

Now that we have listed out the types of contributions, let’s discuss the licensing and legal considerations in OSS.

In OSS, the three most common types of licenses are listed below:

  • MIT License: Permissive and simple, allowing almost unrestricted reuse. It’s one of the most popular OSS licenses due to its simplicity and flexibility.
  • GPL (GNU General Public License): This license requires derivative works to be open source. It ensures that software remains free and open for future generations.
  • Apache License: This license is similar to MIT but includes an explicit grant of patent rights from contributors. It is preferred for projects that want to protect against potential patent litigation.

Understanding Contributor License Agreements (CLAs):

CLAs are legal agreements that contributors must sign, giving the project permission to use their contributions. They clarify the rights and obligations of both contributors and project maintainers, ensuring that contributions can be used freely within the project.  The Faust.js project uses such an agreement.

Headless WordPress OSS

The three main OSS projects for headless WordPress that we use at WP Engine are Faust.js, WordPress, and WPGraphQL:

Faust.js: A toolkit for Next.js for building modern, headless WordPress sites. It simplifies the development of WordPress sites by leveraging the power of React and WPGraphQL.

WPGraphQL: WPGraphQL is a free, open-source WordPress plugin that provides an extendable GraphQL schema and API for any WordPress site.

WordPress: A content management system (CMS) powering a significant portion of the web. Its strong community and extensive ecosystem of plugins and themes make it a versatile and widely-used platform.

Contributions can include adding new features, fixing bugs, or enhancing the documentation. For example, you might implement a new WPGraphQL query, optimize performance, or write a tutorial.

Within the contribution areas, the focus might include improving core functionality, creating example projects, or enhancing developer tools. The Faust.js community is welcoming and always looking for enthusiastic contributors.

Getting Started with Contributing

The first thing you need to do is identify a project that matches your interests and skill set. In this case, we are focusing on JavaScript, WordPress, and WPGraphQL, which are related to web development.

Next, look for active projects with regular updates, responsive maintainers, and welcoming communities. Check for the frequency of commits, issue resolution, and community engagement on forums and social media.

Familiarizing Yourself with the Project

Here are some steps you can take to familiarize yourself with the project you choose.

A project’s documentation tells you about its architecture, purpose, and processes. Read the documentation and contribution guidelines to understand how to contribute effectively; good docs also provide insight into the project’s governance.

The next thing to understand is the project codebase. You can do this by exploring the repository, reading the README.md and CONTRIBUTING.md files, and diving into the directory structure and any key components.

Once you do that, you can review open issues to identify areas where you can contribute. Look for issues that are tagged with “good first issue” or “Need Help,” as these are great opportunities for new contributors to tackle.

Best Practices for Contributing to OSS

There are multiple ways to contribute effectively with best practices in OSS.  Here are some for you to consider.

Effective Communication

Communication is a key practice in OSS contribution. You can do this by participating in chat channels, forums, mailing lists, and social media to stay informed about project updates. This will help you stay on top of ongoing changes and decisions and where the project is heading.

When you ask for help or provide feedback, use clear and respectful language. Be as specific as possible about your issue and provide context to make your point clearer and easier to understand.

You can also help others by answering questions, sharing your knowledge, and providing guidance. Your expertise can benefit other community members and foster a collaborative environment.

Quality Code

Quality code is a crucial part of overall best practice when contributing to OSS. Follow the project’s standards and guidelines to write maintainable and readable code. Consistent coding practices ensure that your contributions are compatible with the existing codebase.

Test your code to ensure software reliability. Write unit tests, integration tests, and end-to-end tests as appropriate to validate your changes.

Writing quality code includes detailed commit messages and pull request descriptions. These will help the maintainers review your changes. Clearly explain your changes, why they are necessary, and how they were implemented.

Review and Feedback Process

Understanding the review and feedback process is important for contributions to OSS. This involves familiarizing yourself with code reviews, responding to feedback constructively, and learning from the review process to improve your skills.

Familiarize Yourself with the Code Review Process: Start by examining past reviews to understand the maintainers’ criteria and expectations. This will give you insights into what maintainers look for in contributions and common areas for improvement. Observing how other contributions are reviewed can also provide valuable lessons.

Responding to Feedback: When you receive feedback on your contributions, be receptive and open-minded. Address the reviewers’ comments thoughtfully and make the necessary changes to improve your work. Constructive feedback is an opportunity for growth, so approach it with a positive attitude and a willingness to learn.

Learning from Reviews: Use the feedback from code reviews to enhance your skills and contribute more effectively. Each review is a learning opportunity that can help you write better code and understand best practices. Embrace the review process as a valuable part of your development journey, and apply the lessons learned to future contributions.

Conclusion

I hope this guide has given you a clear understanding of how to contribute to OSS projects. By following the steps and best practices outlined here, you’ll be well-prepared and excited to start contributing to projects like Faust.js, WPGraphQL, and WordPress. Your contributions will not only positively impact these communities but also help advance the software you use and care about.  As always, we look forward to hearing your feedback, thoughts, and projects so hit us up in our headless Discord!

Using Composer to Manage Plugins and Deploy to WP Engine
https://wpengine.com/builders/using-composer-manage-plugins-deploy/
Thu, 11 Jul 2024
Manage your WordPress dependencies with Composer and deploy to WP Engine with GitHub Actions.

We recently covered how to do branched deploys to WP Engine with GitHub Actions. Today, let’s explore managing plugin dependencies with Composer when deploying to WP Engine.

It helps if you are familiar with Composer, WPackagist, and Git version control. However, if you are not, here is an excellent resource to get you started: Managing your WordPress site with Git and Composer. Also, these instructions assume you have an existing WordPress site hosted on WP Engine that you are retroactively putting under version control.

Here is what we’ll be covering:

  • Version control (Git) – you will only need the wp-content/ directory under version control. We’ll let WP Engine maintain WordPress core automatic updates.
  • Composer – You will use Composer and WPackagist.org to manage WordPress plugins. Note that it will be your responsibility to manage plugin updates with this Composer setup, and utilizing Smart Plugin Manager or WordPress auto-updates is not covered.
    • Bonus: You will learn to install and manage ACF PRO as a Composer dependency.

Overview of project organization

Below is an example of your final GitHub repository. We will explore these in greater detail. (Also, check out the demo codebase on GitHub to follow along with the proposed structure.)

  • .github/workflows/[dev|staging|production].yml: These represent our GitHub Action configuration files for deploying our codebase to their corresponding environments. Be sure to become familiar with the WP Engine GitHub Action for Site Deployment, which relies on using rsync to sync the repository files with the targeted server.
  • rsync-config: These files configure our WP Engine GitHub Action for Site Deployment. The action relies on running a rsync between the GitHub repository and the targeted environment.
    • excludes.txt: Referenced in the .github/workflows/[dev|staging|production].yml file as the explicit rsync FLAGS. These are any items we want to exclude from being deleted each time our GitHub Action runs a rsync.
      • Hint: these files likely exist on the final WP Engine server and we do not want to remove them every time an rsync is executed in our GitHub Action.
    • includes.txt: Referenced in the .github/workflows/[dev|staging|production].yml GitHub Action as the explicit rsync FLAGS. These are any items we want to include in our GitHub Action rsync.
      • Hint: these will likely represent the un-ignored items in your project .gitignore which we’ll cover below.
  • bin/post-deploy.sh: This is how you pass any SCRIPTS to the GitHub Action to run commands on the final destination server.
    • Tip: you can run WP-CLI and Composer install commands on the final WP Engine environment.
  • plugins/: You will rely on Composer and WPackagist to install standard and stable plugins. However, we will also show you how you might handle a custom plugin.
    • plugins/demo-plugin: Represents any custom plugins you may want to version control. You could have as many of these as you like. For example, you could organize your custom functionality as plugins/foo-plugin, plugins/bar-plugin.
  • themes/: Similar to our plugins, you will likely version control a single theme for the final destination.
    • themes/demo-theme: Represents a single, custom theme you would have under version control.
  • .gitignore: It is critical to tell Git what you want to ignore from being under version control, as well as what you do not want to ignore (yes, this sounds odd, but trust us).
  • composer.json: Lists your project’s direct dependencies, as well as each dependency’s dependencies, and allows you to pin relative semantic versions for your dependencies.
  • composer.lock: Allows you to control when and how to update your dependencies.

Start by organizing a copy of your WordPress site’s wp-content/ directory to mirror the organization noted above. It is recommended to create this setup on your local computer. You can access a full site backup from WP Engine’s User Portal.  It is okay if there are other directories within your wp-content/ directory. You will tell Git what you want to ignore, or not ignore in the next step.

Create a .gitignore

Create a .gitignore file in your WordPress installation’s wp-content/ directory and place the code below:

.gitignore (full source)

#---------------------------
# WordPress general
#---------------------------
# Ignore the wp-content/index.php
/index.php

#---------------------------
# WordPress themes
#---------------------------
# Ignore all themes
/themes/*
# DO NOT ignore this theme!
!/themes/demo-theme

#---------------------------
# WordPress plugins
#---------------------------
# Ignore all plugins
/plugins/*
# DO NOT ignore this plugin!
!/plugins/demo-plugin

#---------------------------
# WP MU plugins: these are
# managed by the platform.
#---------------------------
/mu-plugins/

#---------------------------
# WP uploads directory
#---------------------------
/uploads/

#---------------------------
# WP upgrade files
#---------------------------
/upgrade/

#---------------------------
# Composer
#---------------------------
/vendor
auth.json
.env
.env.*
!.env.example

A few key things to note from the code above:

  • /plugins/*: This ignores any directories nested within the plugins/ directory.
    • !/plugins/demo-plugin: This overrides the previous /plugins/* rule so that demo-plugin is not ignored and is version controlled instead.
  • /themes/*: This ignores any directories nested within the themes/ directory.
    • !/themes/demo-theme: This overrides the previous /themes/* rule so that demo-theme is not ignored and is version controlled instead.

You can adjust the demo-plugin or demo-theme examples to work with your setup.
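
For example, if you later add a second custom plugin to version control (foo-plugin here is just the hypothetical name from the earlier example), you would add another negation rule after the catch-all ignore:

# Ignore all plugins
/plugins/*
# DO NOT ignore these plugins!
!/plugins/demo-plugin
!/plugins/foo-plugin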

Set up Composer with WPackagist integration

Composer allows you to manage your PHP dependencies. WPackagist mirrors the WordPress plugin and theme directories as a Composer repository. 

You could also use Composer for PSR-4 / PSR-0 autoloading, linting, and unit testing; here, we’ll focus only on demonstrating how you might pull in some standard WordPress plugins.

Here is a composer.json that installs a few example plugins from WPackagist: Two Factor, Gutenberg, and WordPress SEO. These are here for demonstration; feel free to replace them with plugins that are standard in your workflow.

composer.json (full source)

{
   "name": "wpe/composer-demo",
   "description": "Demo with Composer and deploy to WP Engine.",
   "license": "GPL-2.0+",
   "type": "project",
   "repositories": [
       {
           "type":"composer",
           "url":"https://wpackagist.org",
           "only": [
               "wpackagist-plugin/*",
               "wpackagist-theme/*"
           ]
       }
   ],
   "require": {
       "wpackagist-plugin/two-factor": "*",
       "wpackagist-plugin/gutenberg": "*",
       "wpackagist-plugin/wordpress-seo": "*"
   },
   "extra": {
       "installer-paths": {
           "plugins/{$name}/": [
               "type:wordpress-plugin"
           ],
           "themes/{$name}/": [
               "type:wordpress-theme"
           ]
       }
   },
   "config": {
       "allow-plugins": {
           "composer/installers": true
       }
   }
}

If you’re just integrating Composer into your project the first time then you’ll likely want to now run composer install after creating a composer.json like the one above. This will generate the corresponding composer.lock file for your project with these new dependencies (and their dependencies).

If this this is your first time integrating WPackagist into your existing Composer project then the key things to note from the code above:

  • Add the WPackagist repository under the repositories entry (see lines 6-15 in the code above).
  • Add any plugins or themes you want to install from WPackagist under the require key (see lines 17-19 in the code above). Be sure to use the wpackagist-plugin/ or wpackagist-theme/ prefixed vendor name to tell Composer that you intend these to be installed through WPackagist.
  • Set the installer-paths for your plugins and themes under the extra key to tell Composer where to install your WPackagist dependencies.

Run composer update to install the new required dependencies (see lines 22-29 in the code above).
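
As a quick reference, these are the standard Composer commands involved; composer install, composer update, and composer require are all stock Composer, with nothing WP Engine-specific about them:

# First integration: resolve composer.json and generate composer.lock
composer install

# Existing project, after editing composer.json as shown above
composer update

# Or add a single WPackagist plugin and update the lock file in one step
# (two-factor is just the example plugin used above)
composer require wpackagist-plugin/two-factor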

How to install ACF PRO with Composer

ACF has some useful information on installing ACF PRO with Composer. We’ll use the composer.json code above as our starting point. Here are the steps you’ll need to set this up, which we’ll go over in further detail:

  1. Copy your ACF PRO license key from the Licenses tab. To activate your license via your wp-config.php file, add the following line to the file, replacing [key] with your license key: define( 'ACF_PRO_LICENSE', '[key]' );
  2. Add the ACF package repository to your composer.json.
  3. Install the plugin by running composer require wpengine/advanced-custom-fields-pro.

Here is what your final composer.json file will look like:

composer.json (full source)

{
    "name": "wpe/composer-demo",
    "description": "Demo with Composer and deploy to WP Engine.",
    "license": "GPL-2.0+",
    "type": "project",
    "repositories": [
        {
            "type": "composer",
            "url": "https://connect.advancedcustomfields.com"
        },
        {
            "type":"composer",
            "url":"https://wpackagist.org",
            "only": [
                "wpackagist-plugin/*",
                "wpackagist-theme/*"
            ]
        }
    ],
    "require": {
        "wpackagist-plugin/two-factor": "*",
        "wpackagist-plugin/gutenberg": "*",
        "wpackagist-plugin/wordpress-seo": "*",
        "wpengine/advanced-custom-fields-pro": "^6.3"
    },
    "extra": {
        "installer-paths": {
            "plugins/{$name}/": [
                "type:wordpress-plugin"
            ],
            "themes/{$name}/": [
                "type:wordpress-theme"
            ]
        }
    },
    "config": {
        "allow-plugins": {
            "composer/installers": true
        }
    }
}

There are other ways to install and activate your ACF PRO license, so be sure to check out the full documentation. If you encounter any issues along the way, then send a support message.

Set up WP Engine GitHub Action for Composer integration

WP Engine’s GitHub Action for Site Deployment relies on rsync to transfer and synchronize your local GitHub repository files to the final WP Engine hosting environment. This is critical to keep in mind when you initially set up your GitHub workflows.

Additionally, since we’re organizing the root of our repository around the wp-content/ directory, we want to be sure to configure some key deployment options in the final workflow.

production.yml (full source)

# Deploy to WP Engine Production environment
# https://wpengine.com/support/environments/#About_Environments
name: Deploy to production
on:
  push:
    branches:
     - main
jobs:
  Deploy-to-WP-Engine-Production:
    runs-on: ubuntu-latest
    steps:
    - run: echo "Preparing to deploy to WP Engine production"
    - uses: actions/checkout@v3
    - name: GitHub Action Deploy to WP Engine
      uses: wpengine/github-action-wpe-site-deploy@v3
      with:
        # Deploy vars
        # https://github.com/wpengine/github-action-wpe-site-deploy?tab=readme-ov-file#environment-variables--secrets

        # The private RSA key you will save in the Github Secrets
        WPE_SSHG_KEY_PRIVATE: ${{ secrets.WPE_SSHG_KEY_PRIVATE }}
        # Destination to deploy to WPE
        # Change to your environment name
        WPE_ENV: yourEnvironmentName

        # Deploy options

        # An optional destination directory to deploy
        # to other than the WordPress root.
        REMOTE_PATH: "wp-content/"
        # Optional flags for the deployment
        FLAGS: -azvr --inplace --delete --include-from rsync-config/includes.txt --exclude=".*" --exclude-from rsync-config/excludes.txt
        # File containing custom scripts run after the rsync
        SCRIPT: wp-content/bin/post-deploy.sh

In the code above you’ll want to replace some of the deployment variables, like WPE_ENV, and be sure to set up your SSH keys (both the WP Engine SSH Gateway key and your GitHub repository’s private SSH key secret, WPE_SSHG_KEY_PRIVATE). Again, the helpful WP Engine step-by-step guide can help you here. The key options you will want to pay close attention to are listed below.
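
If you haven’t generated a deploy key yet, one common approach (the exact steps live in WP Engine’s guide linked above, so treat this as a sketch) is to create a dedicated RSA key pair, add the public key to the SSH keys area of the WP Engine User Portal, and store the private key as the WPE_SSHG_KEY_PRIVATE secret in your GitHub repository:

# Generate a dedicated key pair for deployments (the file name is just an example)
ssh-keygen -t rsa -b 4096 -f wpe_deploy_key -N ""

# wpe_deploy_key.pub -> add as an SSH key in the WP Engine User Portal
# wpe_deploy_key     -> save as the WPE_SSHG_KEY_PRIVATE secret in GitHub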

  • REMOTE_PATH (string): Optional path specifying a directory destination to deploy to. Defaults to the WordPress root directory on WP Engine. For this setup you want it to be wp-content/.
  • FLAGS (string): Optional rsync flags such as --delete or --exclude-from.
    • Caution: setting custom rsync flags replaces the default flags provided by the action, so consider also adding the -azvr flags as needed: a preserves symbolic links, timestamps, permissions, and ownership; z enables compression; v enables verbose output; r scans directories recursively.
  • SCRIPT (string): Remote bash file to execute post-deploy. This can include WP-CLI commands, for example. The path is relative to the WP root and the file executes on the remote server. The file can be included in your repo or be a persistent file that lives on your server.

Deployment options for the Deploy WordPress to WP Engine GitHub Action (see the full list).

You will want to pass some rather specific FLAGS and a custom post-deploy SCRIPT to get our targeted setup deploying accurately.

Configure rsync flags

You’ll be running rsync with the --delete flag, which is destructive, so we need to be careful about what we tell it to delete. Below is what you’ll want to put in your rsync-config/excludes.txt and rsync-config/includes.txt files.

excludes.txt (full source)

# Excluding these items from being deleted each rsync

plugins/*
themes/* 
mu-plugins/
uploads/
blogs.dir/
upgrade/*
backup-db/*
advanced-cache.php
wp-cache-config.php
cache/*
cache/supercache/*
index.php
mysql.sql

.env
.env.*
auth.json
vendor

includes.txt (full source)

# Including plugins/themes that we check into
# Git so that the version in GitHub is deployed

/plugins/demo-plugin
/themes/demo-theme

# ...other plugins could go here...

Create a post-deploy script

After everything is deployed, you will want to run composer install on the WP Engine environment. This lets you update your dependencies with Composer locally, commit the changes, and push them to the Git remote; once the GitHub Action rsyncs the updated composer.json and composer.lock, the script installs the updated dependencies on the final environment. This is the SCRIPT: wp-content/bin/post-deploy.sh we set in our GitHub Action’s YML file (above).

post-deploy.sh (full source)

#!/bin/sh

echo "Starting post deploy script..."
echo "Switch directory to wp-content/"
cd wp-content
echo "Installing Composer dependencies..."
composer install --optimize-autoloader --no-dev --no-progress
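
As the earlier tip mentioned, this script can also run WP-CLI commands on the WP Engine environment. A small, hypothetical extension might flush the object cache once dependencies are installed; wp cache flush and the global --path flag are standard WP-CLI, but adjust this to whatever your deployment actually needs:

# Hypothetical addition to post-deploy.sh
# --path points WP-CLI at the WordPress root, one level up from wp-content/
echo "Flushing the WordPress object cache..."
wp cache flush --path=..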

Conclusion

Utilizing Composer with WPackagist to manage your WordPress plugin dependencies can help keep teams organized and facilitate consistent workflows.

Let us know how you’re maintaining your ideal workflow—tag us on X @wpengine.

The post Using Composer to Manage Plugins and Deploy to WP Engine appeared first on Builders.

Request Headers in headless WordPress with Atlas
https://wpengine.com/builders/request-headers-in-headless-wordpress-with-atlas/ | Wed, 03 Jul 2024

Request headers are key-value pairs sent by the client to the server in an HTTP request. They provide additional information about the request, such as the client’s browser type, preferred language, and other relevant data. Request headers play an important role in enabling the server to understand and process the request appropriately.

In this article, I will guide you through how Atlas, WP Engine’s headless hosting platform, automatically adds request headers to requests made to your site.

Prerequisites

Before reading this article, you should have the following prerequisites checked off:

  • Basic knowledge of Next.js and Faust.js.
  • An Atlas account and environment set up.
  • Node.js and npm are installed on your local machine.

Request Headers in Atlas

The Atlas headless WordPress hosting platform automatically appends additional request headers to every request made to your site.  These headers are designed to provide valuable geographical and temporal information about the request origin, enabling you to tailor content and functionality based on the user’s location and local time.

The headers automatically appended are:

  • wpe-headless-country: The ISO alpha-2 country code of the request’s origin.
  • wpe-headless-region: The region within the country from where the request is made.
  • wpe-headless-timezone: The timezone of the request origin in TZ Database format.

Documentation on Atlas request headers is here.

Benefits of Geolocation Headers in Atlas

Geolocation headers offer several advantages:

  • Personalization: Tailor content and experiences based on the user’s location.
  • Localization: Display content in the user’s local language or relevant to their region.
  • Analytics: Gather insights into where your users are coming from to better understand your audience.
  • Compliance: Ensure compliance with regional regulations by adapting content accordingly.

This data has several use cases. You could display localized news and weather updates, or run promotions based on the user’s location. You could also show custom greetings based on the user’s local time, and collect data on user distribution across regions and time zones for better insight and a better user experience.

Now that we understand request headers and how Atlas automates them, let’s implement an example of how to display an output of the request headers on a page.

Rendering Atlas Geolocation Headers in Next.js/Faust.js pages

For this example, we are going to use Next.js. The code and folder/file structure work exactly the same in the experimental App Router package in Faust.js.

Since we are using Next.js 14, which defaults to React Server Components, we can fetch the request headers directly in the page component on the server side.

In the root of the app directory, create a folder called local and, within that folder, create a file called page.jsx. The structure should be: app/local/page.jsx. Once you have created that, copy and paste this code block into the page.jsx file:

import { headers } from 'next/headers';

export default async function LocalPage() {
  const country = headers().get('wpe-headless-country') || 'No country data';
  const region = headers().get('wpe-headless-region') || 'No region data';
  const timezone = headers().get('wpe-headless-timezone') || 'No timezone data';

  return (
    <div>
      <h1>Geolocation Data</h1>
      <p>Country: {country}</p>
      <p>Region: {region}</p>
      <p>Timezone: {timezone}</p>
    </div>
  );
}


Let’s break down what the code is doing here.  At the top of the file, we import the headers function from the next/headers module. This will allow us to access the request headers in this server component.

Next, we define the component, calling it LocalPage. This is our async function that renders the page.

Following that, we retrieve the values of all the headers (country, region, timezone). If the headers are not present, the message defaults to stating that there is no data for that header.

Lastly, we render the component, returning the JSX. The elements on the page will show the geolocation data.
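
As a small, hypothetical extension of this page, you could turn the timezone header into a local-time greeting. Intl.DateTimeFormat is standard JavaScript, so only the header name comes from Atlas; this reuses the timezone variable already defined above:

// Format the current time in the visitor's timezone (falls back to UTC
// when Atlas did not supply a usable header value).
const localTime = new Intl.DateTimeFormat('en-US', {
  timeZone: timezone === 'No timezone data' ? 'UTC' : timezone,
  hour: 'numeric',
  minute: 'numeric',
}).format(new Date());
// ...then render it in the JSX, e.g. <p>Your local time: {localTime}</p>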

Once you have added this page file, make sure you push these changes up to the remote repository connected to Atlas before deployment. GitHub, Bitbucket, and GitLab are the supported providers.

Deploying to Atlas

The last step is to connect your remote repo to Atlas, git push any changes, and visit the local page route we made to see the geolocation data. If you have not connected to Atlas yet, please do so. Here are the docs for reference.

Once you are deployed and have a live URL, you should have a page at the /local route displaying the geolocation data (country, region, and timezone).

Conclusion

By leveraging request headers provided by WP Engine’s Atlas, you can enhance your headless WordPress applications with geolocation data, offer personalized and localized content, gather valuable analytics, and ensure compliance with regional regulations.

As always, we look forward to hearing your feedback, thoughts, and projects so hit us up in our headless Discord!

The post Request Headers in headless WordPress with Atlas appeared first on Builders.

Beta Testing WordPress with Local Blueprints
https://wpengine.com/builders/beta-testing-wordpress-local-blueprints/ | Wed, 26 Jun 2024
Start testing the latest WordPress beta quickly with Local Blueprints.

A new release is on the horizon! 🌅

As with each release, there are countless hours of testing to ensure the overall experience is bug-free and optimized. WordPress 6.6 is targeted to be released on July 16, 2024. Right now, you can help by testing.

Local is the go-to tool for creating a WordPress sandbox and effortlessly developing WordPress sites locally, and it can get you testing in seconds. Here are a few options to get you started.

Check out this video or continue reading to learn about all the ways to get testing.

Option 1: WP-CLI + Local

If you already have an existing site in Local then you can just upgrade it to the latest beta with WP-CLI. Here is how:

  1. Right-click on your site and choose ‘Open site shell’, which will open your system’s terminal application and automatically launch WP-CLI.
  2. Once WP-CLI is launched then just run this command: wp core update --version=6.6-RC1
Open WP-CLI in Local
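
After the update completes, you can confirm the switch from that same site shell; wp core version is a standard WP-CLI command:

wp core version
# should now report the 6.6 release candidate you just installed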

Option 2: Local + WordPress Beta Tester plugin

If you already have a Local site then you can install the WordPress Beta Tester plugin and get the latest beta.

  1. Visit your WordPress dashboard’s Plugins screen and choose ‘Add New’
  2. Search for and install the WordPress Beta Tester plugin
  3. Once activated, visit the Tools > Beta Testing area and update the settings to get the latest beta (select the “Bleeding edge” channel and “Beta/RC Only” stream).
WordPress Beta Tester plugin settings screen

Option 3: Local Blueprint FTW!

Save a few clicks and just import our custom Local Blueprint, which comes with everything installed and ready for testing: WordPress Beta Tester plugin with WP 6.6 Beta 1 already installed and the default Twenty Twenty-Four theme activated.

Just click the download button below and drag and drop the downloaded WordPress-Beta-Tester_6.6-RC-1.zip into your Local app to spin up a new site and get testing!

Drag and drop Blueprint into Local

(Note: the super secret WordPress username and password for the Blueprint are both admin.)

Reach out to @WPEBuilders and let us know how you’re using Local and what you’re testing in the latest WordPress 6.6 beta release.

The post Beta Testing WordPress with Local Blueprints appeared first on Builders.

Best JavaScript Frameworks for Headless WordPress 2024
https://wpengine.com/builders/best-javascript-frameworks-for-headless-wordpress-2024/ | Thu, 13 Jun 2024

WordPress is widely recognized as a leading Content Management System (CMS), providing an excellent content authoring and editing experience. Meanwhile, building web experiences using JavaScript frameworks has never been more popular and can provide more flexibility in the way pages are rendered, performance gains, enhanced security, and other capabilities. Thankfully, it’s possible to combine the two with a headless WordPress approach to get the best of both worlds.

In this article, we’ll explain the concept of headless WordPress and its advantages. We’ll also discuss key features to look for in a framework and present the top five frameworks to help you get started.

Headless WordPress Defined

Headless refers to a CMS that separates its backend content management (the “body”) from the frontend presentation layer (the “head”). In a typical WordPress setup, the backend includes the admin panel and content tools, while the front end is what visitors see.

Many websites, like Taylor Swift’s Time Person of the Year site, utilize WordPress as a headless CMS. This way, the WordPress dashboard can be used solely for content management while employing a different frontend stack for content display. The decoupled frontend and backend can “talk” to one another via the WPGraphQL plugin, which turns your WordPress install into a GraphQL server and allows it to interact with other applications. Alternatively, the WordPress REST API can be used for this purpose.
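
To make that concrete, here is a minimal sketch of a frontend pulling post titles over WPGraphQL; it assumes WPGraphQL’s default /graphql endpoint and uses https://example-wp.com as a placeholder domain:

// Fetch post titles from a WPGraphQL endpoint (sketch)
async function getPosts() {
  const response = await fetch('https://example-wp.com/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: '{ posts { nodes { title uri } } }' }),
  });
  const { data } = await response.json();
  return data.posts.nodes; // [{ title, uri }, ...]
}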

The Benefits of Using Headless WordPress

Using headless WordPress offers several advantages, including:

Enhanced Performance: By decoupling the frontend from the backend, you allow WordPress to focus solely on the content database and API interactions. This setup can lead to faster load times and improved responsiveness, as the API-driven approach optimizes data delivery and reduces overhead.

Omnichannel Experiences: Without a single frontend, headless WordPress can seamlessly integrate with various platforms and technologies. This means you can publish and display content across multiple channels simultaneously, such as websites, mobile apps, and even digital kiosks, allowing for a unified and versatile content distribution strategy.

Improved Security: Headless setups, especially statically generated sites, shrink the traditional attack surface because the WordPress backend and database are not directly exposed to visitors. This significantly reduces the risk of common security threats associated with WordPress, such as SQL injection attacks, making your content safer from potential breaches.

It’s important to note that headless WordPress requires a solid understanding of web development. Additionally, managing and maintaining a headless setup can be more complex than a standard WordPress installation, as it involves handling both the content management system and the separate frontend technology stack.

Things To Consider with a JS Framework in 2024

Implementing headless WordPress can be complex, but frameworks provide the necessary tools to help developers efficiently build frontend platforms. While these frameworks are typically based on JavaScript, they also utilize CSS and HTML.

When selecting a framework, key features to consider include:

Static Site Generation (SSG): Ideal for creating static websites with pre-generated pages, SSG allows for faster loading times since HTML, CSS, JavaScript, and other assets are ready for the browser to consume immediately.

Server-Side Rendering (SSR): For scenarios where SSG isn’t feasible, SSR enables web pages to be rendered on the server before being sent to the user’s browser, enhancing performance and SEO.

Simple Data Fetching: Opt for frameworks that offer straightforward data fetching mechanisms to keep queries from your WordPress backend simple and efficient.

Minimal Configuration: Frameworks that require minimal setup and configuration can accelerate the development process, allowing you to focus on building your site rather than managing settings.

Core Web Vitals: Ensure the framework includes features that help optimize your site for Google’s Core Web Vitals, which measure page loading times, interactivity, and visual stability.

Considering these points will help you select a framework that best suits your needs. Each toolkit offers unique functionalities, and most frameworks have active online communities where you can seek support and advice.

Best Frameworks for Headless WordPress

1. Faust.js

Faust.js is a comprehensive framework built on top of Next.js designed for building frontend applications for headless WordPress sites. It aims to alleviate the challenges associated with headless WordPress development, providing a seamless and enjoyable experience for both developers and content creators.

Out of the box, Faust.js offers several essential features:

  • Post Previews: Allows content editors to preview posts before publishing, ensuring accuracy and consistency.
  • WP Template Hierarchy Replicated in JavaScript: Mirrors the familiar WordPress template hierarchy within a JavaScript environment, simplifying the transition for developers.
  • Authentication: Provides built-in authentication mechanisms, streamlining the process of managing user access and security.
  • Decoupled Blocks: Faust abstracts concepts like block discovery, customization, and rendering, working with the WPGraphQL Content Blocks plugin to query Gutenberg blocks.

These core features are crucial for any headless WordPress site and can be challenging to implement independently. Faust.js simplifies these tasks, enabling you to focus on building your site rather than configuring basic functionalities.

Furthermore, Faust.js integrates the Apollo Client GraphQL library. Utilizing Apollo Client with WPGraphQL in a headless WordPress setup can cache data and improve performance while providing an exceptional developer experience. This combination makes Faust.js a powerful tool for creating modern headless WordPress web applications.

Note that because Faust is built on top of Next.js, it benefits from all the features listed in the Next.js section below, as well.

2. Next.js

Next.js is a React framework for building full-stack web applications. You use React Components to build user interfaces, and Next.js for additional features and optimizations.  Under the hood, Next.js also abstracts and automatically configures tooling needed for React, like bundling, compiling, and more. This allows you to focus on building your application instead of spending time with configuration.

Key features of Next.js / App Router include:

Static Site Generation (SSG): Enables the creation of static pages that are pre-rendered at build time, leading to faster load times.

SEO Optimization: Next.js has a Metadata API that can be used to define your application metadata (e.g. meta and link tags inside your HTML head element) for improved SEO and web shareability.

Server-Side Rendering (SSR):  Provides the ability to render pages on the server for each request, ensuring up-to-date content and better performance for dynamic sites.

ISR: Incremental Static Regeneration, enables you to update static content without needing to rebuild your site completely. This feature is particularly useful for large websites with frequently changing content, ensuring that users always see the most current information without compromising performance.

API Routes: Allows the creation of serverless API endpoints within your Next.js application, simplifying the process of handling backend logic and data fetching.

File-Based Routing: Uses a file-system-based router, where folders and the files within are used to define your app’s routes.

3. Nuxt 

Nuxt is an open-source framework that makes web development intuitive and powerful, letting you create performant, production-grade full-stack web apps and websites with confidence.

Key features of Nuxt.js include:

Universal Mode (SSR): Nuxt supports server-side rendering, enabling your application to render pages on the server before delivering them to the client. This improves SEO and performance, especially for dynamic content.

Static Site Generation (SSG): With Nuxt’s `nuxt generate` command, you can pre-render your site into static HTML. This is ideal for performance and security, as static sites are faster to load and less vulnerable to attacks.

Vue Components and Ecosystem: Leveraging the power of Vue.js, Nuxt.js allows you to build complex user interfaces with reusable components. The extensive Vue ecosystem, including Vuex for state management, enhances the development experience.

Automatic Code Splitting: Nuxt.js automatically splits your code by page, which optimizes loading times by only loading the necessary JavaScript for the current page.

Module System:  Nuxt.js boasts a robust module system, allowing easy integration of functionalities like PWA support, Google Analytics, and more. Modules simplify the addition of features without extensive configuration.

SEO-Friendly: Nuxt.js offers built-in support for meta tags and structured data, ensuring your site is optimized for search engines without requiring additional plugins.

File-Based Routing:  Nuxt.js uses a file-based routing system that mirrors your project’s directory structure, making it intuitive to define and manage routes.

4. SvelteKit

SvelteKit is an innovative framework built on Svelte, designed to provide a seamless development experience for building web applications. 

Key features of SvelteKit include:

  • Truly Reactive: SvelteKit leverages Svelte’s truly reactive nature, allowing for highly efficient and straightforward state management. This results in a more intuitive and performant development process compared to traditional frameworks.
  • Static Site Generation (SSG): SvelteKit supports static site generation, enabling you to pre-render your site at build time. This results in faster load times and improved SEO, as the static pages are optimized for performance and search engine indexing.
  • Server-Side Rendering (SSR): SvelteKit provides first-class support for server-side rendering, allowing pages to be rendered on the server and delivered to the client. This enhances performance and ensures that dynamic content is always up-to-date.
  • ISR: With support for Incremental Static Regeneration, SvelteKit provides the performance and cost advantages of prerendered content along with the flexibility of dynamically rendered content.
  • Simplified Data Fetching: SvelteKit streamlines data fetching through its built-in “load” functions. These functions allow you to fetch data at the page level, making it easy to manage and render data from your WordPress back end.
  • Routing and Navigation: SvelteKit features a file-based routing system, similar to Nuxt.js and Next.js, that simplifies route management. Additionally, SvelteKit provides client-side navigation for a smooth, SPA-like experience.
  • Hot Module Replacement (HMR): SvelteKit includes hot module replacement, which speeds up the development process by allowing you to see changes in real-time without a full page reload.
  • Progressive Enhancement: SvelteKit ensures that your application works even with JavaScript disabled, providing a basic but functional version of your site. This approach enhances accessibility and user experience across different devices and network conditions.
  • Built-In Adapters: SvelteKit supports various deployment targets through built-in adapters, such as Vercel, Netlify, and Cloudflare Workers. This simplifies the deployment process and allows you to choose the best hosting solution for your needs.

5. Astro

Astro is a modern, innovative framework designed for content-driven websites.

Key features of Astro include:

  • Islands Architecture: Astro introduces an “islands architecture,” where static content is pre-rendered by default, and JavaScript is only loaded for interactive components. This approach significantly enhances performance by reducing the amount of JavaScript that needs to be processed on the client side.
  • Static Site Generation (SSG): Astro excels at static site generation, enabling you to pre-render your entire site at build time. This results in extremely fast load times and improved SEO, as static pages are optimized for performance and search engine indexing.
  • Partial Hydration: With Astro’s partial hydration feature, only the necessary JavaScript for interactive components is loaded, further optimizing performance and user experience. This ensures that your site remains fast and responsive.
  • Framework Agnostic: Astro is framework-agnostic, allowing you to use components from Astro itself, React, Vue, Svelte, Solid, and more within the same project. This flexibility enables you to leverage the best tools for different parts of your application.
  • Simplified Data Fetching: Astro provides a straightforward data fetching mechanism, allowing you to easily pull data from your WordPress back end using WPGraphQL. This simplifies the process of integrating dynamic content into your static site.
  • Automatic Image Optimization: Astro includes built-in support for automatic image optimization, ensuring that images are served in the most efficient format and size. This further enhances performance and load times.
  • Markdown and MDX Support: Astro supports content authoring with Markdown and MDX, making it easy to create and manage content-rich pages. This feature is particularly useful for blogs and documentation sites.
  • File-Based Routing: Astro uses a file-based routing system that mirrors your project’s directory structure, simplifying route management and navigation.
  • SSR and Hybrid Rendering: While primarily focused on static site generation, Astro also supports server-side rendering (SSR) and hybrid rendering approaches. This allows you to combine the benefits of static and dynamic content as needed.
  • Built-In Integrations: Astro offers a wide range of built-in integrations, including support for popular CMSs, deployment platforms, and other tools. This extensibility makes it easy to add functionality and streamline your development workflow.

Where To Host

There are various platforms available for hosting your frontends and WordPress installations. However, the best-managed platform that provides optimized hosting for both your frontend and backend is WP Engine’s headless platform. It offers an all-in-one solution for building radically fast sites. WP Engine handles all the optimizations, configurations, CDNs, DevOps, git connections and more, ensuring your WordPress backend is always up to date and performing well.

 If you want to give it a try for yourself, please check it out here.

Summing It All Up

In conclusion, many of these frameworks share similar capabilities and features. Ultimately, the choice depends on your team and developers’ preferences for specific features and the development environment they enjoy working in. Additionally, consider the particular use cases for which you are building a headless WordPress application.

Whatever framework you decide to choose, remember that new JavaScript frameworks will continue to emerge, offering even more options. 🤣

But seriously, whatever you decide to go with, we would love to hear about your experiences and feedback with that particular framework using headless WordPress. So hit us up in our Discord and come join the headless stoke!

The post Best JavaScript Frameworks for Headless WordPress 2024 appeared first on Builders.

ISR Support for Next.js/Faust.js on WP Engine’s Atlas
https://wpengine.com/builders/isr-support-for-next-js-faust-js-on-wp-engines-atlas/ | Mon, 03 Jun 2024

WP Engine’s Atlas is THE headless WordPress hosting platform. In this article, I will discuss and guide you through the easy implementation of the latest feature on Atlas: support for ISR with Next.js/Faust.js. By the end of this article, you will have a better understanding of ISR and of using Atlas to support it with Next.js/Faust.js.

Prerequisites

Before reading this article, you should have the following prerequisites checked off:

  • Basic knowledge of Next.js and Faust.js.
  • An Atlas account and environment set up.
  • Node.js and npm are installed on your local machine.

If you do not and need a basic understanding of Next.js and Faust.js, please visit the docs:

https://nextjs.org/docs

https://faustjs.org/tutorial/get-started-with-faust

What is ISR?

Incremental Static Regeneration (ISR) is a feature introduced in Next.js that allows you to update static content after it has been deployed. Unlike traditional static site generation, which regenerates all pages at build time, ISR enables you to regenerate individual pages on a timed interval as new requests come in. This ensures that your site remains highly performant while still delivering up-to-date content to your users.

Why Use it in headless WordPress?

In headless WordPress, the front end is decoupled from the WordPress backend, often using Next.js and Faust.js to render the website. This architecture offers several advantages, such as improved performance, enhanced security, and greater flexibility in choosing front-end technologies.

However, one challenge with headless WordPress is ensuring that content changes in WordPress are reflected on the front end without sacrificing performance. This is where ISR becomes crucial. By leveraging ISR, you can achieve the following benefits:

Up-to-date Content: ISR allows your site to fetch the latest content updates from WordPress at specified intervals. For example, with a revalidation time of 60 seconds, Next.js will check for content updates at most once every 60 seconds. The first request after the interval is served the stale page while the page regenerates in the background, and subsequent requests receive the updated content, keeping the delay in content updates minimal.

Enhanced Performance: Since ISR updates only the specific pages that need regeneration at set intervals, your site remains fast and responsive. The initial load times are minimized, and only the changed content is updated, reducing the server load and build times.

SEO Benefits: Static pages are highly favored by search engines due to their speed and reliability. With ISR, you maintain the SEO advantages of static generation while ensuring that your content is always fresh and relevant.

Scalability: ISR enables your site to handle large volumes of content efficiently. Whether you’re running a blog with frequent updates or an e-commerce site with dynamic product listings, ISR ensures that your site scales seamlessly.
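
For reference, here is roughly what the 60-second example above looks like in an App Router page; revalidate is Next.js route segment config, and the WordPress URL below is a placeholder for your own WPGraphQL endpoint:

// app/posts/page.jsx (sketch)
export const revalidate = 60; // regenerate this page at most once per minute

export default async function PostsPage() {
  // Pull post titles from a placeholder WPGraphQL endpoint
  const res = await fetch('https://example-wp.com/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: '{ posts { nodes { title } } }' }),
  });
  const { data } = await res.json();

  return (
    <ul>
      {data.posts.nodes.map((post) => (
        <li key={post.title}>{post.title}</li>
      ))}
    </ul>
  );
}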

All those benefits got me stoked!  Let’s get it on our Next.js and Faust.js sites!

Configuring Next.js with Atlas ISR Support

Here are the docs link to the Atlas support for ISR.

In your Next.js project, go to your terminal and install the @wpengine/atlas-next package:

npm install --save @wpengine/atlas-next

This package provides improved support on Atlas.

Once you install it, ensure it is in your project by navigating to your package.json file at the root of your project:

"dependencies": {
   "@wpengine/atlas-next": "^1.1.0",
   "autoprefixer": "10.4.14",
   "eslint": "8.44.0",
   "eslint-config-next": "13.4.9",
   "next": "14.1.2",
   "postcss": "8.4.25",
   "react": "18.2.0",
   "react-dom": "18.2.0",
   "tailwindcss": "3.3.2"
 }

Now that you have verified the installation, and still at the root of your project, modify your next.config.js file like so:

const { withAtlasConfig } = require("@wpengine/atlas-next");


/** @type {import('next').NextConfig} */
const nextConfig = {
 // Your existing Next.js config
};


module.exports = withAtlasConfig(nextConfig);

Faust.js Wrapper

If you are using Faust.js, all you need to do is modify your next.config.js file using the withFaust wrapper:

const { withFaust } = require("@faustwp/core")
const { withAtlasConfig } = require("@wpengine/atlas-next")

/** @type {import('next').NextConfig} */
const nextConfig = {
  // Your existing Next.js config
}

module.exports = withFaust(withAtlasConfig(nextConfig))

Next, we need to verify that it works. Run your app in dev mode via npm run dev; the output in your terminal should confirm that the Atlas ISR integration is active.

Stoked!!! It works!!!

Atlas User Portal

We now have ISR set up with the proper configuration. The last steps are to connect our remote repository to Atlas, git push any changes, and observe ISR working in all its cache invalidation glory.

If you have not connected your local project to a remote repository, go ahead and do so.  Atlas supports GitHub, Bitbucket and GitLab.

Once you have connected your remote repository and added all your necessary environment variables, go ahead and build the app. If you have done this with an existing repo, you can git push the change, which will trigger a build.

When the application has finished the build step, you will land on the main page of the Atlas portal. Navigate over to the Logs subpage.

In the Logs subpage, click the “Show logs” button on the Runtime option.

You should see the same output you saw in your terminal, confirming that it’s working properly.

Awesome!!! It is implemented and working at runtime. Now, when you edit or add new content in your WP backend and then visit the live URL of the Atlas site you just deployed, the updated content should appear on the timed interval you configured.

Limitations

Just a note, the docs state that this feature is currently in the Beta phase, which entails:

  1. Functional completeness, offering comprehensive support for Next.js Incremental Static Regeneration.
  2. Ongoing assessment by Atlas Platform teams regarding the feature’s effect on website performance and application scalability.

Conclusion

Implementing Incremental Static Regeneration (ISR) with Next.js and Faust.js on WP Engine’s Atlas platform is a game-changer for maintaining performance and up-to-date content in a headless WordPress setup. By following the steps outlined in this guide, you can leverage ISR to ensure your site remains both fast and current, without the need for full rebuilds. 

The integration with Atlas also simplifies the deployment and management process, providing a seamless workflow from development to production.

Get stoked on ISR and Atlas to deliver awesome user experiences and keep your site at the edge of web performance and content freshness. As always, we look forward to hearing your feedback, thoughts, and projects, so hit us up in our headless Discord!

The post ISR Support for Next.js/Faust.js  on WP Engine’s Atlas appeared first on Builders.
