Proxy / Cache: A faster local environment
Apiumhub
Posted on March 5, 2020
Some projects have a local development environment that isn't completely isolated and depends on infrastructure we cannot run locally, whether because of licensing issues, time constraints, etc.
In my case, this dependency is a service that receives numerous API calls, each taking some time to respond, and since we don't have the pertinent license we cannot test it locally.
Since we don't want to disclose client data, we'll use this page full of cats as an example: https://http.cat/.
json-caching-proxy
Searching around, I found an npm library that caches these API calls, so a request only takes time the first time it's made. There are plenty of libraries that can do this; I chose the following one, which seems to work pretty well: https://www.npmjs.com/package/json-caching-proxy.
npm install -D json-caching-proxy
Once installed, we'll create a main.js file and add the following piece of code, which is all we need to start a server that listens for requests and redirects them to the real page.
const JsonCachingProxy = require('json-caching-proxy');

const proxy = new JsonCachingProxy({
  "remoteServerUrl": "https://http.cat/", // the real server we want to proxy and cache
  "proxyPort": 8080, // local port our proxy listens on
  "cacheEverything": true, // cache every content type, not only JSON
  "cacheBustingParams": ["_", "dc", "cacheSlayer"], // query params stripped before caching
  "showConsoleOutput": true, // log cache hits and misses to the console
  "dataPlayback": true, // serve responses from the cache when available
  "dataRecord": true, // record new responses into the cache
  "commandPrefix": "proxy", // prefix used for the proxy's own command routes
  "proxyHeaderIdentifier": "proxy-cache-playback", // header used to flag responses served from the cache
  "proxyTimeout": 500000 // timeout (ms) for requests to the remote server
});

proxy.start();
This example is taken from the library's own page; we've just replaced the remoteServerUrl parameter with the domain we want to cache, in our case https://http.cat/, and set proxyPort to 8080, the port on which we want to listen for requests on localhost.
On the library's page we can find other configuration options, such as:
- how to keep the cache between sessions with the inputHarFile parameter, since in this example all cached data is lost once we stop the server;
- how to cache only certain requests that go through a particular route;
- how to exclude certain requests with the excludedRouteMatchers parameter (see the sketch after this list);
- etc.
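For example, a minimal sketch of excluding some routes from the cache might look like this (it assumes excludedRouteMatchers takes an array of regular expressions, as the library's page describes; the /login and /logout paths are purely illustrative):

const JsonCachingProxy = require('json-caching-proxy');

const proxy = new JsonCachingProxy({
  remoteServerUrl: 'https://http.cat/',
  proxyPort: 8080,
  cacheEverything: true,
  // Requests whose URL matches any of these patterns are never cached,
  // so they always go to the real server.
  excludedRouteMatchers: [new RegExp('/login'), new RegExp('/logout')]
});

proxy.start();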
Next, we start the server with this command:
node main.js
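If we prefer, we can also add a script to the project's package.json so the proxy can be started through npm (the script name proxy is just an illustrative choice):

"scripts": {
  "proxy": "node main.js"
}

and then run it with npm run proxy.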
The server will be left listening for requests on localhost through port 8080. The console will show this:
JSON Caching Proxy Started:
==============
Remote server url: https://http.cat/
Proxy running on port: 8080
Proxy Timeout: 500000
Replay cache: true
Save to cache: true
Command prefix: proxy
Proxy response header: proxy-cache-playback
Cache all: true
Cache busting params: _,dc,cacheSlayer
Excluded routes:
Listening...
Now we can access the page through our localhost:8080 server, which will be in charge of redirecting requests to https://http.cat/.
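For instance, we can request one of the status-code images through the proxy from the command line (the /404 path is just one of the cat images http.cat serves):

curl -s http://localhost:8080/404 -o 404.jpg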
If we take a closer look at the messages appearing in the console, we can see all the requests we're making. Yellow messages indicate that it's the first time a resource is accessed, so the request is redirected to the real page and the response is stored in the cache; the next time it's requested we skip the waiting time of the real API call.
Now we can navigate the website normally while our server keeps storing all requests in the cache.
Meanwhile, green messages indicate that the resource has been accessed before, so the request isn't sent to the real page and the response is obtained directly from our cache.
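To see this from code rather than from the console colours, a small script like the one below requests the same resource twice and prints the cache header of each response. It's a sketch that assumes the proxy marks played-back responses with the proxy-cache-playback header we configured as proxyHeaderIdentifier, and that we're on a Node version with a global fetch:

// check-cache.js: hit the same URL twice and inspect the playback header
const url = 'http://localhost:8080/404';

(async () => {
  for (let i = 1; i <= 2; i++) {
    const res = await fetch(url);
    // proxy-cache-playback is the proxyHeaderIdentifier configured in main.js;
    // it should only appear on responses served from the cache.
    console.log(`request ${i}: status ${res.status}, cache header: ${res.headers.get('proxy-cache-playback')}`);
  }
})();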
Charles
In order to visualize and debug the results, we can use applications like Charles or Burp. In our example, we'll use Charles.
We'll activate Charles' Reverse Proxies option and add an entry that maps local port 80 to our proxy's port 8080. This will make Charles intercept all API calls to localhost:80 and redirect them to our proxy-cache server.
Now we can access the site directly at localhost:80 and examine the result of each of our requests in Charles.