How to check Ajax Request and Response in the Browser?

Are you developing an Ajax-based application, or adding some Ajax mechanism to your web pages, and wondering how to verify the request and response of these Ajax calls?

Well! We shall open the browser's Developer Tools and inspect the Ajax request and response.

Open the Chrome browser and navigate to the URL of your website that makes Ajax calls.

Now open the Developer Tools.

Options (Customize and control Google Chrome) -> More Tools -> Developer Tools.

There will be many tabs, but we use the ‘Network’ tab for inspecting Ajax calls.

Perform an action that triggers an Ajax Call.

For demonstration, we used the Submit Quiz Question form on TutorialKart. When you fill in the details and click Submit, an Ajax call is sent.

Ajax Request and Response in Browser Developer Tools

How to check the Ajax Request and Response

In the picture above, you can see that there is a new entry in the Network tab with the name admin-ajax.php.

Click on the entry.

Ajax Call Details

Headers Tab

In the Headers tab, you have the following sections:

  • General
  • Response Headers
  • Request Headers
  • Form Data

In the General section, you see details like the Request URL, the Request Method (GET/POST), the Status Code (if 200, then it is a success), and the Remote IP Address.

Form Data

If you made the Ajax call from a form and sent the serialized form data, you can see the fields sent in the Ajax call under the Form Data section.

Ajax Call Form Data
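As a sketch, form data like this can be serialized and sent with `fetch`; the field names and the `admin-ajax.php` endpoint below are illustrative, not taken from the actual quiz form:

```javascript
// Build the serialized form data (field names here are hypothetical).
const formData = new URLSearchParams();
formData.append('action', 'submit_quiz');     // hypothetical Ajax action name
formData.append('question', 'What is Ajax?'); // hypothetical form field

// Sending it would look like the line below; each such call appears as a
// new entry in the Network tab, with the fields under Form Data.
// fetch('/wp-admin/admin-ajax.php', { method: 'POST', body: formData });

// This is the exact string sent in the request body.
console.log(formData.toString());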

Any response from the Ajax call can be seen under the Response tab.

Ajax Call Response
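The content shown under the Response tab is simply the raw text the server returned. For a JSON response, it can be inspected by parsing that text; the sample payload below is made up for illustration:

```javascript
// A sample JSON body as it might appear under the Response tab (made-up payload).
const rawResponse = '{"success": true, "message": "Quiz submitted"}';

// In a real call you would get this via response.json() from fetch;
// here we parse the raw text directly to inspect it.
const data = JSON.parse(rawResponse);
console.log(data.success, data.message);
```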

How to display the Less Than symbol in an HTML Page?


The less than symbol is used in writing HTML tags. But if you want to display the less than symbol in an HTML page, use &lt;.

&lt; is the named HTML entity for the less than symbol.

You can also use the hex code &#x003C; or the decimal code &#60; to display the less than symbol in HTML.
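For example, all three forms below render a less than symbol in the page, where writing the raw character would start a tag instead:

```html
<!-- &lt; renders as the < character in page content -->
<p>5 &lt; 10</p>
<p>Hex code: &#x003C;</p>
<p>Decimal code: &#60;</p>
```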



Indian Rupee INR – HTML Hex and Decimal Code

20B9 – HTML Hex Code for Indian Rupee or INR


Following is an example that illustrates how to display the Indian Rupee (INR) symbol in an HTML page using the INR hex code.
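A minimal snippet using the hex character reference:

```html
<!-- &#x20B9; is the hex character reference for the Indian Rupee sign -->
<p>Indian Rupee, INR symbol is &#x20B9;</p>
```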

Indian Rupee, INR symbol is ₹

8377 – HTML Decimal Code for Indian Rupee or INR


Following is an example that illustrates how to display the Indian Rupee (INR) symbol in an HTML page using the INR decimal code.
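A minimal snippet using the decimal character reference:

```html
<!-- &#8377; is the decimal character reference for the Indian Rupee sign -->
<p>Indian Rupee, INR symbol is &#8377;</p>
```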

Indian Rupee, INR symbol is ₹


What is Node.js

Node JS

Node.js is a JavaScript runtime built on Chrome’s V8 JavaScript engine.

The V8 engine compiles JavaScript code directly to machine code before executing it, rather than following the traditional flow through an interpreter.

Node.js is an asynchronous, event-driven JavaScript runtime.

Asynchronous event-driven model: a model in which a piece of code is executed as and when its event happens, rather than waiting for the previous event to complete.

Consider clientA requesting resource1. While clientA is being served, another client, clientB, requests resource2. With traditional server software, clientB is served only after clientA is served.

What if serving clientA takes a very long time? All the clients that come after clientA have to wait that long, which is not acceptable in real time.

This is why Node.js is built on an event-driven model. While clientA is being served, the control does not wait for the request to complete. It delegates the work of fetching the resource and meanwhile serves other clients. When the requested resource is ready, clientA is served with it.
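The behavior described above can be sketched in plain Node.js: a slow task (standing in for fetching clientA's resource) is delegated via a timer callback, and the next client is served immediately without waiting. The client names here are illustrative:

```javascript
const order = [];

// clientA requests a resource that takes long to fetch.
order.push('clientA: request received');
setTimeout(() => {
  // This callback runs later, when the resource is ready (the event fires).
  order.push('clientA: resource ready, response sent');
}, 100);

// Meanwhile, clientB is served immediately; the runtime did not block.
order.push('clientB: request served');

console.log(order); // clientB is served before clientA's resource is ready
```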

Node.js is open-source

Node.js’ source code, blueprint, and design can be used, modified, and/or shared under defined terms and conditions. This allows end users and commercial companies to review and modify the source code for their own customization, curiosity, or troubleshooting needs.

Node.js is cross-platform

Node.js is implemented on multiple computing platforms. Node.js can run on Microsoft Windows, Linux, and macOS.

Node.js is written in C, C++, JavaScript

The parts of Node.js that need the most performance are written in C. C++ implements some other modules as well as the integration with the V8 engine. The built-in modules are largely written in JavaScript.

Node.js is licensed under MIT

The MIT license imposes minimal restrictions on the use and distribution of Node.js.

Other Names

Node.js is often referred to as Node, Node JS, or NodeJS.

Node.js Resources

Official Node.js Site

Node.js Tutorial

Node.js Interview Questions

How artificial intelligence could take over jobs

Artificial intelligence is on the rise nowadays. There is a lot of research going on and many advancements are happening. Artificial intelligence has certainly come to the field of analytics. There are many jobs based on a set of rules that a person has to follow on a daily basis, and artificial intelligence and cognitive systems have become smart enough to act based on such a set of rules. With the latest advancements in machine learning algorithms, artificial intelligence applications have reached the level of an average-IQ human being.

For a single task, artificial intelligence applications have reached far beyond the intelligence of human beings; but when there is a good mix of tasks, they do not yet perform up to the mark. Still, in a time that is not very far off, these applications with built-in intelligence are going to replace human beings.

Many companies are working towards realising these applications with intelligence built into them. Only the leaders in this race are going to survive the competition. Once the leading companies take the lead, most of the companies performing below the market average shall dissolve.

What do you know about Apache Spark?

Apache Spark is an open-source cluster-computing framework. Originally developed at the University of California, Berkeley’s AMPLab, the Spark codebase was later donated to the Apache Software Foundation, which has maintained it since.

Apache Spark’s story

In order to process and analyze huge amounts of data efficiently, Apache Hadoop relied on an engine called MapReduce, and soon MapReduce became the only way of processing and analyzing data in the Hadoop ecosystem. Being the only one of its kind, it influenced communities to develop new engines to process big data. This led to the evolution of Spark at Berkeley’s AMPLab. The developers at AMPLab decided to take advantage of the already established big data open-source community, so they donated the codebase to the Apache Software Foundation, and Apache Spark was born.

What does Apache Spark comprise?

Before going into the discussion of what Spark can do, let’s have a quick look at what Spark has inside it. Excluding the Spark Core, Apache Spark has four libraries that address four areas. They are:

  1. Spark SQL
  2. Spark Streaming
  3. Spark Machine Learning library (also called Spark MLlib)
  4. GraphX

What can Apache Spark do?

Now that we know what Spark has, let us see what Spark can do.

  1. Unlike Hadoop, Spark can process data in mini-batches and perform transformations.
  2. With the help of Spark’s distributed machine learning framework, machine learning tasks could run on Spark cluster with commodity hardware.
  3. Similarly, graph processing could also be done using the distributed framework.
  4. Structured and semi-structured data could be processed using SQL component of Apache Spark.

References to learn Apache Spark

If you are interested in learning Apache Spark, here are a few useful links that will help you get started. Feel free to get your hands dirty.

  1. Apache Spark Official by Apache Software Foundation
  2. Apache Spark Tutorial by TutorialKart

How Apache Kafka is helping Industry

Apache Kafka is an open-source stream processing platform, written in Scala and Java, developed by the Apache Software Foundation. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.

Apache Kafka has become popular in industry with the rise of stream processing. Many existing organisations are looking to include Kafka in their new projects, while some are trying to incorporate Kafka into their existing applications.

Currently, Kafka is being used for:

  • Application Monitoring
  • Data Warehousing
  • Asynchronous Applications
  • Recommendation Engines in Online Retail
  • Dynamic Pricing Applications
  • IOT (Internet Of Things)

What is the industry saying about Kafka?

  1. Kafka helps applications work in a loosely coupled manner.
  2. Kafka handles stream processing and has thus become the underlying data infrastructure.
  3. Kafka enables real-time processing of high volumes of data.
  4. Kafka improves application scalability.

Other References

If you are interested in learning Apache Kafka, you may refer to the following links.