
· 6 min read

Chatbot_Anansi

A chatbot is a software application that uses AI to have conversations with users, helping them find information or answer questions. We built this chatbot using Retrieval-Augmented Generation (RAG) to improve its responses, Neo4j to store structured data, and Large Language Models (LLMs) to understand and generate natural language.

We created two node labels, "Bank" and "Owner", and one relationship type between them, "IS_OWNED_BY". The post below lays out how we built a chatbot that uses RAG techniques to query the relationship between these node types.
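To make the data model concrete, here is a minimal sketch of how such a graph could be seeded using the neo4j-driver npm package; the connection URL, credentials, and property values are placeholder assumptions, not details of our actual deployment.

// A minimal sketch: create a Bank node, an Owner node, and the
// IS_OWNED_BY relationship between them.
const neo4j = require('neo4j-driver');

async function seedGraph() {
  // Placeholder connection details; replace with your own instance and credentials.
  const driver = neo4j.driver('bolt://localhost:7687', neo4j.auth.basic('neo4j', 'password'));
  const session = driver.session();
  try {
    await session.run(
      'CREATE (b:Bank {name: $bank})-[:IS_OWNED_BY]->(o:Owner {name: $owner})',
      { bank: 'Acme Bank', owner: 'Jane Doe' } // example property values
    );
  } finally {
    await session.close();
    await driver.close();
  }
}

seedGraph().catch(console.error);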

· 8 min read

In modern software development, well-documented APIs are crucial for collaboration and integration. Postman is a popular tool for testing and documenting APIs, and the OpenAPI Specification is a widely accepted standard for defining RESTful APIs. In this blog post, we'll walk you through converting a Postman collection into an OpenAPI specification and then using the OpenAPI Generator to create a JavaScript client from it.

There are two different ways to approach API development:

  1. Design First: Postman -> Swagger/OpenAPI Spec -> Code

  2. Code First: Code -> Postman Collection -> Swagger

In this blog post, we will explore the 'Design First' approach to API development.

Design First vs Code First - High-Level Differences

| Aspect | Design First Approach | Code First Approach |
| --- | --- | --- |
| Initial Project Concept | Requires a clear understanding of project goals and requirements from the start. | Allows for a more flexible approach, where the initial project concept may not be well-defined. |
| Team Composition | Involves a cross-functional design team, including architects and designers. | Relies on a team of developers, testers, and domain experts who can adapt to changing requirements. |
| Starting Point | Begins with a detailed design and specification of the software system. | Starts by writing code to create a basic functional prototype or MVP. |
| Flexibility | May lead to more rigid development, with a focus on adhering to the initial design. | Offers greater adaptability to evolving requirements and allows for frequent changes. |
| Feedback Gathering | Feedback is primarily gathered during the design and specification phase. | Feedback is collected continuously throughout the development process based on the evolving codebase. |
| Documentation | Detailed design documentation is created before implementation. | Documentation is created as the design and specifications emerge during development. |
| Testing and Quality Assurance | Testing is performed against the well-defined design. | Testing is integrated throughout the development process to ensure code quality. |
| Stakeholder Involvement | Stakeholder involvement primarily during design and specification phases. | Continuous stakeholder involvement and feedback are encouraged throughout development. |
| Refactoring and Optimization | Design changes are costly and may require significant rework. | Code is refactored and optimized as needed to maintain quality and adapt to changing requirements. |
| Initial Deployment | Deployment may occur later in the development process. | Allows for continuous deployment of prototypes and updates to gather real-world feedback. |
| Ongoing Maintenance | Maintenance activities are largely predictable based on the initial design. | Ongoing maintenance is essential, with a focus on adapting to evolving needs and addressing emerging issues. |

Both approaches have their advantages and are suited to different project scenarios. The choice between "Design First" and "Code First" depends on the project's specific requirements, goals, and constraints.

Convert a Postman Collection to a Swagger Specification

There are several methods to convert a Postman collection to a Swagger Specification. In this section, we'll discuss some of the most popular ways to accomplish this task:

  1. Swagger Editor: One of the most common methods is to use the Swagger Editor. This web-based tool provides a user-friendly interface for creating and editing Swagger specifications. It allows you to manually convert your Postman collection into a Swagger Specification by copy-pasting the relevant information.

  2. Stoplight Studio: Another excellent option is Stoplight Studio, an integrated development environment for designing and documenting APIs. Stoplight Studio provides features that can help streamline the process of converting a Postman collection to Swagger.

  3. npm Package: If you prefer a programmatic approach, you can explore various npm packages and libraries that are designed to automate the conversion process. These packages often provide command-line tools and scripts to convert your collection to a Swagger Specification, making the process more efficient.

Each of these methods has its advantages and is suitable for different use cases. You can choose the one that best fits your workflow and requirements.

For the purposes of this demo, we'll perform the conversion using the npm package.

Step 1: Installing the "postman-to-openapi" Package

The first step is to install the "postman-to-openapi" package, which allows you to convert Postman collections to OpenAPI specifications. To do this, open your terminal and run the following command:

npm i postman-to-openapi -g

  • npm: This is the Node Package Manager, a package manager for JavaScript that is commonly used to install and manage libraries and tools.

  • i: This is short for "install," and it's the npm command used to install packages.

  • postman-to-openapi: This is the name of the package you want to install.

  • -g: This flag stands for "global," and it tells npm to install the package globally on your system, making it available as a command-line tool that you can run from any directory.

This command uses the Node Package Manager (npm) to install the package globally on your system. Once installed, the converter is available from any directory via its p2o command-line tool.

Step 2: Converting Postman Collections to OpenAPI Specifications

Now that you have "postman-to-openapi" installed, you can convert your Postman collections to OpenAPI specifications. Replace ~/Downloads/REST_API.postman_collection.json with the path to your Postman collection and specify where you want to save the resulting OpenAPI specification. Use the following command:

p2o ~/Downloads/REST_API.postman_collection.json -f ~/Downloads/open-api-result.yml

  • p2o: This is the command you use to run the postman-to-openapi tool. It is installed globally on your system using npm i postman-to-openapi -g, as mentioned earlier.

  • ~/Downloads/REST_API.postman_collection.json: This is the path to your Postman collection file. You should replace this with the actual file path to your Postman collection that you want to convert.

  • -f: This flag specifies the output file for the converted OpenAPI specification.

  • ~/Downloads/open-api-result.yml: This is the path where the resulting OpenAPI specification will be saved as a YAML file. Replace it with the file path where you want the specification written.

This command will take your Postman collection, process it, and save the resulting OpenAPI specification as a YAML file. Ensure that you provide the correct file paths.
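If you'd rather script the conversion, for example as part of a CI pipeline, the same package also exposes a programmatic API. A minimal sketch, assuming the collection and output files live in the current directory:

const postmanToOpenApi = require('postman-to-openapi');

// Convert the collection and write the result as YAML.
// The paths are examples; point them at your own files.
postmanToOpenApi('./REST_API.postman_collection.json', './open-api-result.yml')
  .then((spec) => console.log('OpenAPI spec generated:\n' + spec))
  .catch((err) => console.error(err));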

Convert an OpenAPI Spec to JavaScript Code

Step 1: Configuring Your Java Environment

To generate code from your OpenAPI specification, you'll need a Java environment, since the OpenAPI Generator CLI runs on the JVM. Use the following commands to set the JAVA_HOME environment variable and update the PATH:

export JAVA_HOME=/usr/lib/jvm/java-1.11.0-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH

  • export JAVA_HOME=/usr/lib/jvm/java-1.11.0-openjdk-amd64: This command sets the JAVA_HOME environment variable to the specified directory, which is typically the root directory of a specific JDK version. In this case, it appears to be pointing to the root directory of OpenJDK 11.

  • export PATH=$JAVA_HOME/bin:$PATH: This command prepends the JDK's bin directory to the PATH environment variable. PATH is the list of directories the system searches for executable files, so adding $JAVA_HOME/bin ensures the java command from this JDK is found first.

These commands are essential for Java development and ensure that the correct Java version is used in your development environment.

Step 2: Generating JavaScript Code from OpenAPI

To complete our journey, we'll use the OpenAPI Generator CLI to generate code from our OpenAPI specification. First, download the CLI JAR file using wget:

wget https://repo1.maven.org/maven2/org/openapitools/openapi-generator-cli/7.0.1/openapi-generator-cli-7.0.1.jar -O openapi-generator-cli.jar

This command uses wget to download the CLI JAR file and saves it as openapi-generator-cli.jar (the -O flag sets the output filename).

Finally, run the OpenAPI Generator CLI to create code from your OpenAPI specification:

java -jar ./openapi-generator-cli.jar generate -i ./open-api-result.yml -g javascript -o ./nodejs_api_client

  • java -jar ./openapi-generator-cli.jar: This part of the command runs the Java JAR file (openapi-generator-cli.jar) using the java command. The JAR file is responsible for generating code from the OpenAPI specification.

  • generate: This is a command provided by the OpenAPI Generator CLI to instruct it to generate code based on the OpenAPI specification.

  • -i ./open-api-result.yml: This flag specifies the input OpenAPI specification file. In this case, it's using the file named open-api-result.yml.

  • -g javascript: This flag specifies the target generator. In this case, it's generating JavaScript code.

  • -o ./nodejs_api_client: This flag specifies the output directory where the generated code will be placed. In this case, the code will be generated in a directory called nodejs_api_client in the current working directory.

This command invokes the CLI JAR and writes a complete JavaScript client package into the nodejs_api_client directory; a minimal sketch of how the generated client might be used follows.
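The module and method names below depend on your spec: ApiClient and DefaultApi are the generator's defaults for untagged operations, while the base URL and getUsers are hypothetical placeholders for illustration.

// Hypothetical usage of the generated client; adjust names to match your spec.
const GeneratedClient = require('./nodejs_api_client/src/index');

const apiClient = new GeneratedClient.ApiClient();
apiClient.basePath = 'http://localhost:3000'; // assumed server URL

const api = new GeneratedClient.DefaultApi(apiClient);
// Generated methods use the callback style (error, data) by default.
api.getUsers((error, data) => {
  if (error) console.error(error);
  else console.log('API returned:', data);
});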

In summary, these steps guide you through converting a Postman collection into an OpenAPI specification and then generating code from that specification. It's a streamlined process that aids API development and documentation, and following these commands and tools can enhance your development workflow and collaboration.


· 3 min read

Dockerfile

A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image.

GitHub Actions

GitHub Actions is a continuous integration and continuous delivery (CI/CD) platform that allows you to automate your build, test, and deployment pipeline. You can create workflows that build and test every pull request to your repository, or deploy merged pull requests to production.

· 11 min read

What is cryptography?

Cryptography is the practice of secure communication in the presence of third parties. It uses mathematical algorithms to encode and decode messages, making it difficult for unauthorized individuals to read or modify the data being transmitted, and relies on techniques such as encryption, hashing, and digital signatures to protect messages from unauthorized access or tampering.
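As a small illustration of one of those techniques, here is a sketch of hashing with Node's built-in crypto module (the input string is arbitrary):

// Hashing is a one-way function: any change to the input yields a
// completely different digest, which is why hashes are used for
// integrity checks.
const { createHash } = require('crypto');

const digest = createHash('sha256').update('hello world').digest('hex');
console.log(digest); // 64 hex characters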

· 4 min read

Introduction

Docker is a popular tool used by developers and IT professionals to streamline the process of building, packaging, and deploying applications. It allows applications and their dependencies to be packaged into lightweight, portable, and self-sufficient containers that can be run consistently across different environments.

What are Containers?

Containers are lightweight and portable packages that contain everything needed to run a software application, including the code, libraries, and settings. Containers are similar to virtual machines but are more lightweight and efficient, as they don't require a separate operating system. They are widely used in modern software development and deployment practices to package, distribute, and run applications in a consistent and reproducible way.

What is Docker?

Docker is an open-source platform that allows you to package and run applications in containers, providing a lightweight, portable, and isolated way to develop, deploy, and manage applications across different computing environments.

Docker Architecture


Docker Client: The primary component users interact with; it sends commands such as docker build and docker run to the Docker host.

Docker Host/Server: Runs the main program, the Docker daemon, which listens for API calls from the Docker client and manages Docker objects such as containers, images, volumes, and networks.

Docker Registry: A Docker registry is a centralized repository for storing and sharing Docker images. Docker Hub is the default public registry, which contains a large number of official and community-contributed Docker images.
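Because everything the client does goes through the daemon's API, other programs can talk to the daemon too. Here is a minimal sketch using the dockerode npm package (an assumption; it is not part of Docker itself) to list containers the way docker ps -a does:

// Query the Docker daemon's API over its default Unix socket.
const Docker = require('dockerode');
const docker = new Docker({ socketPath: '/var/run/docker.sock' });

docker.listContainers({ all: true })
  .then((containers) => {
    // Print a short ID, the image, and the state for each container.
    containers.forEach((c) => console.log(c.Id.slice(0, 12), c.Image, c.State));
  })
  .catch(console.error);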

Docker Concepts

Docker Image: A read-only template that contains everything needed to run a container, including the application code, dependencies, and configuration. Docker images are stored in a registry, such as Docker Hub, from which they can be pulled to create containers.

Docker Container: A running instance of a Docker image. Containers are isolated from each other and from the host system, providing consistency and reproducibility across different environments.

Docker Compose: A tool for defining and running applications composed of multiple Docker containers, declared together in a single configuration file.

Docker Swarm: A technique for creating and maintaining a cluster of Docker engines. If a service on one engine or node goes down, it can be run on another engine, and a service deployed on one node can be accessed from the other nodes in the same cluster.

Docker Volumes: A volume is essentially a directory managed by Docker. It can be accessed directly by a container, and its data is kept when you remove the container.

Difference between Bind Mount Volumes and Normal Volumes

| Bind Mount Volumes | Normal Volumes |
| --- | --- |
| Managed by the user. | Managed entirely by Docker. |
| Dependent on the host's directory structure. | Not dependent on the host's directory structure. |
| Backup and migration are the user's responsibility and are not straightforward. | Backup and migration are straightforward. |

What are Docker Networks?

Docker networks are virtual networks that allow containers to communicate with each other securely and efficiently. Three common network types are:

Bridge Network: A private network that allows containers on the same host to communicate with each other. Containers on a bridge network are connected to a virtual switch and can reach one another using container names or IP addresses.

Host Network: The container is created directly on the host's network, so the container's network configuration is identical to that of the host.

Null Network: Used to create a container that has no network interface at all. Since there is no network interface, the container cannot be exposed externally; it is typically used for isolated workloads that only operate on local data.

Basic Docker Commands

| Description | Command |
| --- | --- |
| Build an image from a Dockerfile | docker build -t image-name . |
| Bring up a container from an image in the background | docker run -d image-name |
| Get a shell inside a running container | docker exec -it container-id /bin/bash |
| Run an image as an interactive, detached container | docker run -itd image-name |
| Check container status | docker ps |
| Create an image from a container | docker commit container-id new-image-name |
| Create a volume for sharing data between containers | docker volume create --driver=nfs volume_name |

In summary, Docker provides portability, consistency, efficiency, reproducibility, flexibility, scalability, DevOps integration, security, and a vibrant community, making it a powerful tool for modern application development and deployment.