

AJAX and REST APIs

(Data Layer part 2)

OK so we left off with localStorage to persist tasks between sessions.

I'm tempted to jump right into my favorite way to persist data between your users and not just locally (using Firebase).

But I'll cover some fundamentals first. They go like this:

AJAX

AJAX (Asynchronous JavaScript and XML) revolutionized websites in 2004, when Gmail showed everyone just what was possible with a new paradigm.

The idea is this: loading a webpage is expensive (lots of libraries, images, videos, authentication sessions, etc.). So DON'T RELOAD PAGES. If you need dynamic data, use small requests in the background of the page.

Alright here we go, here's an example of an AJAX call:

See the Pen JokeAPI jQuery GET by Andy Novocin (@AndyNovo) on CodePen.
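The pen isn't embedded here, but a minimal sketch of a jQuery version of the joke app might look like this (the JokeAPI URL and the element ids are assumptions, not a copy of the pen):

```javascript
function loadJoke() {
  // $.getJSON fires the GET in the background and parses the JSON body for us
  $.getJSON('https://v2.jokeapi.dev/joke/Programming?type=single', function (data) {
    $('#joke').text(data.joke); // update the page without reloading it
  });
}

// in the pen, this would be wired to a button, something like:
// $('#joke-btn').on('click', loadJoke);
```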

Now I'm going to show 2 other versions of the same exact app, to help you understand the history, the future, and the big picture.

See the Pen JokeAPI XHR example GET by Andy Novocin (@AndyNovo) on CodePen.
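For the historical version, a sketch of the same app using the raw XMLHttpRequest object (again, the endpoint and element id are assumptions):

```javascript
function loadJoke() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', 'https://v2.jokeapi.dev/joke/Programming?type=single');
  xhr.onload = function () {
    if (xhr.status === 200) {
      // XHR hands us raw text; we parse the JSON ourselves
      var data = JSON.parse(xhr.responseText);
      document.getElementById('joke').textContent = data.joke;
    }
  };
  xhr.send();
}
```

Notice how much bookkeeping jQuery was hiding: opening the request, watching for load, parsing the body.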

Now for the most modern version fetch:

See the Pen JokeAPI fetch example GET by Andy Novocin (@AndyNovo) on CodePen.
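A sketch of the fetch version, promise-based and built into modern browsers (same assumed endpoint and element id as above):

```javascript
async function loadJoke() {
  // fetch returns a promise of the response; .json() parses the body
  const res = await fetch('https://v2.jokeapi.dev/joke/Programming?type=single');
  const data = await res.json();
  document.getElementById('joke').textContent = data.joke;
  return data.joke;
}
```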

Now find an API on the internet for getting the UTC time, and adjust these examples to give the UTC time whenever the button is clicked.

REST APIs

APIs and The layered cake

The quintessential HTTP practical REST API

The History

History

When computers were so expensive that a large company could only afford one, we had systems for maximizing utility: mainframes and 'terminals'.

In essence each person was using the same exact machine even if there were many places to access it.

That mainframe was the oracle that had everything: the data, the UX, direct access.

As personal computers began to make things cheaper, people networked to the mainframe and SSHed into it (because we had the paradigm of terminals). This widened the pool of people who could use computers effectively, so we started to abstract the terminal away into the background and give the personal computers applications that managed data access from the mainframe.

Switching out the mainframe or databases was a very difficult process for a company in that world: it was needed to keep up performance-wise, but it halted progress and put the company at risk. In the meantime, the networks inside companies (designed to give employees access to the data) started to connect to each other: the internet (between the networks).

As the vision of what the internet could mean began to take hold, we had to standardize around protocols to allow strangers to work together.

I believe that the concept of internet APIs emerges onto the scene at the height of Object-Oriented Programming's rise to popularity.

APIs are all about the separation of concerns

In good OOP you Code to an interface, not an implementation.

The goal is to reduce "strong coupling" between the parts of your system. The less you need to know in order to use a service the better.

If you code to an interface, your team can work in parallel, parts of the system can be completely reworked, you need fewer meetings, you can serve a wider range of clients, and the code gets cleaner, so there are fewer bugs.

REST comes from a PhD thesis, so it has a more abstract intention than how it is really used. BUT at its heart it is THE meta interface for data. Since we all know now to code to interfaces, REST APIs came onto the scene as the right mix of a simple, scalable, pleasant, and implementation-agnostic way to get access to your data on the internet.


Engineering your backend with REST APIs

REST is a convention that helps you build a simple interface to your data. When done well most client-side developers will immediately understand how to use your services. So it allows many different apps to be built from the same server and often even community-built services. You can even swap out databases without any client-side code needing to change.



CRUD as your guide

The central four functions of a data-driven app (pretty much all of them) are:

  Create
  Read
  Update
  Delete

When planning out a web app, you have your skeleton in place once you can do these four things.

REST as CRUD

REST stands for Representational State Transfer, and there are purists who will be happy to debate you on the RESTfulness of your design. The philosophy behind REST is: make stateless, simple, uniform client interfaces for server resources.

A REST API has two parts to it. One is a way of mapping URLS to your resources. The other is mapping the HTTP methods to your actions.

Mapping URLS to resources

So think about the objects in your app. It might be students and courses or users and purchases but they typically align with database tables.

I tend to think in terms of collections and identifiers which are nested.

As an example, suppose you have users, each user has a unique username, and each user has a collection of permissions. Then I might return those values at the following URLs (the :attribute notation is a way of indicating a variable):

  /users
  (the collection of all users)
  /users/:username
  (a single user)
  /users/:username/permissions
  (that user's permission collection)

Once you've got a URL which represents a particular collection or object (a resource) then we can map the HTTP verbs to actions on that resource:

Suppose I wanted to keep many users, each with an id. I could imagine the following RESTful interface/contract/agreement:

      
  POST /user
  (create a user and return their id)
  GET /user/:user_id
  (get information about user :user_id)
  PUT /user/:user_id
  (alter the information about user :user_id)
  DELETE /user/:user_id
  (destroy user :user_id)
      
    

If that is my contract and my API is hosted at http://myapi.com, then someone visiting the URL http://myapi.com/user/2 should see data about user number 2.
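To make the contract concrete, here is a hypothetical client-side walk through all four verbs using fetch. The host myapi.com and the response shapes are assumptions for illustration, not a real service:

```javascript
async function demo() {
  // CREATE: POST to the collection; the server returns the new user's id
  const created = await fetch('http://myapi.com/user', { method: 'POST' });
  const { id } = await created.json();

  // READ: GET the individual resource
  const user = await (await fetch('http://myapi.com/user/' + id)).json();

  // UPDATE: PUT new data to the same URL
  await fetch('http://myapi.com/user/' + id, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: 'Andy' }),
  });

  // DELETE: destroy the resource
  await fetch('http://myapi.com/user/' + id, { method: 'DELETE' });
  return user;
}
```

Notice that the URL names the resource and the verb names the action; that split is the whole trick.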

REST in the wild: Find the Twitter REST API. How is it structured? What sort of resources do they grant access to? How about Reddit, do they have a REST API? What does it show you about their data stores?

NODE and Express

JavaScript's popularity is very connected to NodeJS, which runs JavaScript on servers.

Node as scripting

Today's educational gift is NodeJS: we'll take our fledgling JavaScript skills to the server.

First Server: Head to glitch.com, make a new "Hello, Node" project, edit the file named server.js, and give it the one line console.log("starting server");. Now, under Tools, open your terminal and run your "server" with the line node server.js

Node is just a high-speed JavaScript engine that runs on the server. At this stage it is more like Python than a proper server. In fact, let's demo that now.

Node as server

To get a script to run in response to AJAX/HTTP requests we need a process to wait and listen.

First simple server: Edit your server to look like the following snippet. Now visit the magic URL (Click Show in New Window).

So now that we can handle requests and write responses, the skeleton is in place to build server-side features. The rest is frosting (well, at least to taste).

Variable URLs: take a look through the request object and find the attributes that show you the HTTP verb and the requested resource. console.log those instead and restart node. Now visit your magic URL + /bananas/are/great.

Use the tools you've got: Now try to respond to requests of type /users/:userid with the reaction <p> You requested data on user number USERID_HERE</p>


Express: the routing engine

The next part of a MEAN stack to master is Express. Express is a library for routing HTTP requests, and it makes the launch of a REST API pretty painless.

Express is a 3rd-party library, and some tutorials you find might cover Express version 3 rather than Express version 4, so keep an eye out.

To include express on glitch.com: click package.json, then click Add Package, search for express, and add it.

Create this server: the following snippet has a simple express driven server to test out. Notice the difference when you head to MAGICURL/ vs MAGICURL/bananas/are/great

The role that express plays is to make your last task easier: reacting to particular URLs, and particular verbs on those URLs, is now as simple as router.get, router.post, router.put, etc.

Now, one thing that is different about working with modern JavaScript is that most listeners use callback functions. These callbacks are what allow large amounts of traffic to be processed by a simple machine: requests are accepted instantly and the callback is registered, then the computation happens when there is time to deal with it. PHP would just hang if you got too much traffic or too long a computation.

URL parameters: the express engine deals with REST parameters in the nicest way possible, really. Install the following server, which does the earlier /users/:userid task in express style.

The Express API is at http://expressjs.com/en/4x/api.html

Follow the above install steps to add body-parser

Use the following server.js to host static files and handle POST data.

In the public folder add a file index.html and put something there.

Note that you are serving static files AND reacting to API stuff.

Read User Data: Head to Postman and try to send the key/value pair {"name":"Andy"} to your server. You'll note that you need a Content-Type header for the raw JSON POST; x-www-form-urlencoded should just work.
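What Postman is doing under the hood can also be done from the browser with fetch. A sketch, assuming the /users POST route from the previous server; the key point is the header:

```javascript
async function createUser(name) {
  const res = await fetch('/users', {
    method: 'POST',
    // without this header the JSON body-parser never fills in req.body
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: name }),
  });
  return res.text();
}
```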

Today's Flag:

https://graceful-four-hummingbird.glitch.me/