HTTP Request Log Shipping on VIP Go

This document is for sites running on VIP Go.

Overview #

VIP’s Log Shipping feature allows you to automatically save HTTP request logs to an Amazon Web Services S3 bucket at 5-minute intervals. The logs are then available to your team and contractors for storage, processing, or analysis. Logs are an important asset for understanding how your system is used, diagnosing connectivity issues, tuning performance, tracking usage patterns, and analyzing service interruptions.

Currently, Log Shipping is provided only for your HTTP (web) request logs.

Requirements #

You will need:

  • An AWS S3 bucket; make a note of the bucket name and region
  • Access to create/update the AWS Bucket Policy configuration for the bucket

Currently, enabling this feature (in beta) requires assistance from the VIP Support team, so please open a support ticket if you would like to enable it.

Configuration #

1. Get your AWS bucket name and region.

2. Enter them into the dashboard under Settings > Log Shipping.

[Screenshot: configuring Log Shipping in the VIP dashboard]

3. The dashboard will generate a config file in JSON format that you need to paste into your AWS Bucket Policy configuration. For the desired bucket, navigate to “Permissions,” then select “Bucket Policy,” paste the generated JSON, and save.
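For orientation, a bucket policy that grants write access generally has the following shape. This is an illustrative sketch only; the account ID, role name, and bucket name below are placeholders, so always use the exact JSON generated by the dashboard rather than this example.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowLogShippingWrites",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:role/example-log-shipper" },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-log-bucket/*"
    }
  ]
}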

4. Once the configuration information is entered into the dashboard, a test file named vip-go-test-file.txt is uploaded to the bucket as part of the verification process. This file will always be present in a site’s configured bucket and path, alongside the date folders that contain the logs themselves.

The path used to write to the bucket is [bucket]/[app_name]/[app_environment], e.g. my-log-bucket/my-app/production. This means that you can use the same bucket for more than one app or environment, should you choose to do so.
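Once shipping is enabled, you can confirm this layout with a short script. Below is a minimal sketch using Python and boto3; the bucket name and app/environment prefix are hypothetical, and it assumes AWS credentials are already configured locally.

# List shipped log objects for one app/environment.
import boto3

s3 = boto3.client("s3")
bucket = "my-log-bucket"        # your bucket name
prefix = "my-app/production/"   # [app_name]/[app_environment]/

# Note: list_objects_v2 returns at most 1,000 keys per call;
# paginate for buckets with more objects.
resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

You should see vip-go-test-file.txt under the prefix, along with the dated folders of logs.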

Viewing and utilizing the data in your logs #

The log files are written as a series of gzipped JSON files. Here is a sample record:

{
  "client_site_id": "000",
  "remote_user": "",
  "request_url": "/",
  "wplogin": "-",
  "timestamp": "19/May/2020:17:03:58 +0000",
  "request_type": "GET",
  "scheme": "https",
  "http_referer": "https://example/",
  "http_x_forwarded_for": "",
  "true_client_ip": "",
  "remote_addr": "REDACTED",
  "tls_version": "TLSv1.3",
  "content_type": "text/html; charset=UTF-8",
  "upstream_country_code": "GB",
  "sent_cache_control": "max-age=300, must-revalidate",
  "timestamp_iso8601": "2020-05-19T17:03:58+00:00",
  "sent_vary": "Accept-Encoding",
  "sent_x_cache": "hit",
  "request_time": "0.001",
  "http_host": "example.com",
  "http_accept_language": "en-US,en;q=0.9",
  "http_user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.138 Safari/537.36",
  "http_version": "HTTP/2.0",
  "body_bytes_sent": "8981",
  "status": "200"
}
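Because each log file is gzip-compressed, a few lines of scripting are enough to inspect records like this locally. Here is a minimal Python sketch; the filename is hypothetical, and it assumes one JSON record per line (newline-delimited JSON), so adjust the parsing if your files are structured differently.

# Read records from a downloaded, gzipped log file.
import gzip
import json

with gzip.open("requests.json.gz", "rt", encoding="utf-8") as fh:
    for line in fh:
        record = json.loads(line)
        print(record["timestamp_iso8601"], record["status"], record["request_url"])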

The JSON-formatted log files are readable individually by humans, but to make full use of your logs you will need to ingest them into another service. Here are some example platforms that can help you make the most of your data, depending on your use case; for quick one-off questions, a short local script like the one sketched after this list may be all you need:

  • ELK (Elasticsearch, Logstash, Kibana) will help you filter and view your logs
  • Splunk will help you search, monitor, and analyze the data from your logs
  • Datadog will help you understand development issues within your logs
  • Botify will help you understand SEO issues revealed by your log data
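For example, here is a short sketch that tallies response statuses and the page-cache hit rate (from the sent_x_cache field) across downloaded files. It makes the same newline-delimited JSON assumption as the sketch above, and the local file layout is hypothetical.

# Tally HTTP statuses and cache hits across downloaded, gzipped log files.
import glob
import gzip
import json
from collections import Counter

statuses, cache = Counter(), Counter()
for path in glob.glob("*.json.gz"):
    with gzip.open(path, "rt", encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            statuses[record["status"]] += 1
            cache[record["sent_x_cache"]] += 1

print("statuses:", dict(statuses))
hit_rate = cache["hit"] / max(1, sum(cache.values()))
print(f"cache hit rate: {hit_rate:.1%}")

If a quick tally like this answers your question, you may not need a full ingestion pipeline for every task.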
