Rick Delpo
Posted on December 10, 2021
This year I went serverless, using AWS Lambda with S3 as my data table. OMG, I ditched Java and jQuery in favor of Lambda!
|             | Old stack       | New stack                          |
|-------------|-----------------|------------------------------------|
| Front end   | jQuery and Ajax | Plain vanilla JavaScript and Axios |
| Logic layer | Java Servlet    | AWS Lambda                         |
| Back end    | MySQL           | S3                                 |
Note: we are in a Windows environment locally and not running Node.js on our own machine (the Lambda itself runs on AWS's Node.js runtime).
I love Lambda, but I don't especially love the learning curve to get a full app going. Spoiler alert: you can test some simple Lambda right inside the Lambda console to drastically reduce the learning curve, or you can write a full stack app, which is what I am doing.
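If you just want the quick taste first, here is a minimal sketch (a toy handler of my own, not the tracker code later in this post) that you can paste straight into the Lambda console and test with no front end at all:

//toy handler you can test entirely inside the Lambda console
exports.handler = async (event) => {
  return { message: 'hello ' + event.name };
};
//in the console, create a test event like {"name": "Rick"} and hit Test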
Recently (2021) I was playing around with React. I needed a change because I am an old dinosaur and have been using jQuery for years. Soon after, I noticed this serverless craze out there in Dev Land. Since I already had an Apache24 instance on AWS EC2, I started getting curious about how I could use ONLY Lambda and S3 so I could eliminate EC2 and my clunker database.
I dug up lots of Lambda tutorials and quickly found out how to manipulate a Dynamo NoSQL database on AWS. This NoSQL idea started nagging me because I am very old school and have been using MySQL for eons. I was also playing around with Axios on the front end as my HTTP client.
Then it dawned on me that my data is always rendered as JSON, and that NoSQL is also JSON. But using Dynamo was getting to me, so I thought I would cut to the chase and do NoSQL JSON directly in S3. Why not make S3 my database, I thought? There are more reasons not to do this than to do it, but I am experimenting here.
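To make that concrete: the S3 "table" is just a JSON array of row objects, something like this (made-up sample rows, using the same fields my Lambda pushes below):

[
  { "country": "US", "session": "abc123", "page_name": "home", "hit": "2021-12-10", "ip": "203.0.113.7", "time_in": "2021-12-10T14:01:00Z", "time_out": "2021-12-10T14:03:30Z" },
  { "country": "CA", "session": "def456", "page_name": "about", "hit": "2021-12-10", "ip": "198.51.100.9", "time_in": "2021-12-10T14:02:00Z", "time_out": "2021-12-10T14:02:45Z" }
]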
I also ran into rendering problems in React: Array.reduce was taking too long on large tables. Helloooo... introducing latency and cold starts. At first I got really discouraged with Lambda because of the latency, but I found a way around it all, and my SQL background helped immensely here. I discovered that Lodash was a more user-friendly way to write reduce code. The objective here is to have a nice reduced view of my data, a dashboard view so to speak, a much condensed view of the data. But doing all this with Dynamo became unwieldy and complex, not to mention pricey. So I thought of a down-and-dirty S3 approach where I insert new data, save, fetch, reduce, and save again, all in the background with a small Lambda program.
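And that reduced "dashboard view" ends up as another tiny JSON file, one object per country (again, made-up numbers here):

[
  { "country": "US", "unique_users": 42, "hits": 97 },
  { "country": "CA", "unique_users": 11, "hits": 23 }
]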
Since I was new to Lambda, this project seemed very daunting at first. But I was not new to ad hoc requests from my teammates at work. When they request a certain view of the data, they simply want it on a dashboard, and they never seem to care how I make the magic work. They just want it, and they want it now. Since many of these ad hoc requests almost always involve just one table that constantly gets updated and re-rendered, I figured S3 and Lambda would do the trick, and it turns out I was right about this.
At the same time, though, I was finding React kind of complicated, so I opted for a plain vanilla JavaScript front end. Excuse me, all of you out there, but sometimes, based on the requirement, we just need to break some rules. Also, PS: I was using React without Node anyway, via the Babel library. There was another insidious thing going on too: Babel was setting a cookie, and I had just spent the last two years trying to be cookie free.
By now, folks, I am sure you just want to see some code, so here is my Lambda right now:
//demo of passing javascript params into s3 json file, acting as my database
//this is a Node.js Lambda
//we need to upload 2 modules, Lodash and node-fetch
const AWS = require('aws-sdk');
const fetch = require('node-fetch');
const lodash = require('lodash');
const s3 = new AWS.S3();
//note: the 2 node modules were zipped into the same folder called nodes2.zip using 7zip, then imported
exports.handler = async (event) => {
const res = await fetch('https://rickd.s3.us-east-2.amazonaws.com/tracker2.json'); //get current working json file in s3
const array = await res.json();
//pass user geo javascript variables into environment
array.push({
country: event.country2,
session: event.ses,
page_name: event.date2,
hit: event.hit2,
ip: event.ip2,
time_in: event.time2,
time_out: event.time3
});
//S3 bucket
var params = {
Bucket: 'rickd',
Key: 'tracker2.json',
Body: JSON.stringify(array), //pass fetch result into body with update from push
ContentType: 'application/json',
};
//save above data back to s3 json file
var s4Response = await s3.upload(params).promise();
//reduce above array into a new view using lodash
var result = lodash(array)
  .groupBy('country')
  .map((rows, country) => ({
    country: country,
    unique_users: lodash.uniq(lodash.map(rows, 'ip')).length, //count of distinct IPs per country
    hits: lodash.size(rows) //size takes one argument; row count per group = hit count
  }))
  .value(); //unwrap the lodash chain back to a plain array
//sort the reduced rows descending by unique_users
let sortedInput = result.slice().sort((a, b) => b.unique_users - a.unique_users);
//this time bucket params passes lodash into new json file called view.json
var params2 = {
Bucket: 'rickd',
Key: 'view.json',
Body: JSON.stringify(sortedInput), //pass fetch result into body
ContentType: 'application/json',
};
//save reduced data back to s3 view.json file
var s3Response = await s3.upload(params2).promise(); //save to s3
//return sortedInput; //only return this if testing otherwise not needed
};
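One caveat on the handler above: it reads event.country2 directly, which works when API Gateway is wired up as a non-proxy integration (the POST body becomes the event itself). If you use the more common Lambda proxy integration instead, the body arrives as a string in event.body, so you would unwrap it first. A small sketch, assuming proxy integration:

//sketch: normalize the incoming event so the same handler works either way
const body = typeof event.body === 'string' ? JSON.parse(event.body) : event;
//then push body.country2, body.ses, etc. instead of reading event directly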
Then there are all the little caveats and obstacles you need to know about to actually get this all up and running in AWS.
Examples of gotchas and caveats:
1. In Lambda we need to import two Node libraries (Lodash and node-fetch).
2. We also need an API Gateway in AWS to provide a link to Lambda.
3. We need a JSON file in S3 and need to know how to write to this file.
4. Beware of AWS IAM permissions, as they are always a problem for beginners (see the policy sketch after this list).
5. Newsflash!! BTW, we also need AWS-style SSL and a CloudFront distribution.
6. And DNS pointed to AWS Route 53... had enough yet?
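On gotcha 4, as promised, here is a sketch of the kind of IAM policy the Lambda execution role needs so it can read and write the bucket. The bucket name is the one from my code; swap in your own:

{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": ["s3:GetObject", "s3:PutObject"],
    "Resource": "arn:aws:s3:::rickd/*"
  }]
}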
Hold on there, pardner!!! Way too much information... there is no way I can teach all of this without going on a massive tangent. It is overwhelming at first, but we gain much satisfaction after crossing this hurdle, because it catapults us right into modern-day 2020s serverless knowhow.
Remember, this is all just to get some Lambda going. But don't you, as a developer, want to be cutting edge? I do, and for me this is all in the rear-view mirror now. Take a deep breath and convince yourself that this endeavor is a must-do.
BTW, all the above is on a Windows platform. I don't have a clue about doing this in Linux or whatever else is out there.
Now for my front end code:
//Front end: this is just a snippet of my code... at some point I can provide the whole tutorial
passVars(); //run passVars then on each pg click run passVars again to capture pg name
function passVars() {
//pass the geo vars (ses, hit2, etc.) to Lambda and read them as event.ses etc. inside the handler
//note: the payload object here is called article
//these geo vars come from a jsonp lookup
const article = { ses: session, city2: city, hit2: date, date2: pg_name, ip2: ip_address, country2: country, time2: now, time3: out };
axios
.post("https://xx8pchcrt4.execute-api.us-east-2.amazonaws.com/default/lodash2",article)
.then(res => {
//console.log(res.data);
});
//this axios call records user time in; see onpopstate for user time out. This is all part of my geo tracking app
}
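For the curious, the time-out half mentioned in that last comment boils down to stamping the out variable and re-posting. A rough sketch of the idea (the real version belongs in the full tutorial):

//rough sketch: when the user navigates away, stamp the exit time and re-post
window.onpopstate = function () {
  out = new Date().toISOString(); //feeds time3 in the article object above
  passVars(); //re-post so Lambda records the user time out
};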
More to come soon!
Original content can be found at https://howtolearnjava.com