|| Agentic AI and GenAI with Cloud Course


This course provides an in-depth understanding of the next generation of Artificial Intelligence through Agentic AI and Generative AI, combined with the practical power of Cloud Computing. Students will learn to design and deploy intelligent, autonomous agents capable of decision-making, collaboration, and goal-oriented behavior using cutting-edge frameworks.

The course also covers Generative AI models, including Large Language Models (LLMs), that can generate human-like text, images, and more. Learners will explore prompt engineering, fine-tuning, and API integration to create real-world GenAI applications.

To ensure industry relevance and real-world applicability, students will be introduced to cloud platforms like AWS, Azure, and GCP, enabling the deployment of scalable and secure AI services. Topics such as AI workflows, serverless computing, data storage, and CI/CD pipelines for AI are also covered.

|| What will I learn?

  • Explain the fundamentals of Agentic AI, including autonomous agents, reasoning, goal completion, and task delegation.
  • Differentiate between traditional AI systems and agent-based models.
  • Build and interact with Generative AI models for tasks like text generation, image creation, summarization, and more.
  • Utilize APIs such as OpenAI, Hugging Face, or similar to create GenAI-powered applications.
  • Use cloud services (AWS, Azure, or GCP) to host, scale, and manage AI applications.
  • Integrate cloud tools for data storage, model inference, and continuous deployment.
  • Design intelligent systems combining Agentic AI, Generative AI, and cloud-based pipelines.
  • Implement real-world AI solutions with modular, reusable components.

|| Requirements

  • Enrollment in Bachelor of Information Technology (BIT) or related undergraduate program
  • Completion of foundational subjects such as:
  • Programming Fundamentals (preferably Python)
  • Basics of Artificial Intelligence or Machine Learning (optional but recommended)
  • Introduction to Cloud Computing (desirable)


    The Agentic AI and Generative AI with Cloud course offered by BIT is designed to equip students with cutting-edge knowledge and practical skills in the fast-evolving field of artificial intelligence. This course introduces learners to the powerful concepts of Agentic AI, where intelligent agents can plan, reason, and act autonomously, and Generative AI, where models can create text, images, and more using advanced language models like GPT. Students will gain hands-on experience with popular frameworks such as LangChain, LangGraph, and CrewAI, while also learning to deploy these solutions using cloud platforms like AWS, Microsoft Azure, and Google Cloud Platform (GCP). Through real-world projects and cloud-integrated pipelines, the course emphasizes building scalable, intelligent systems that solve complex tasks—preparing students for industry-ready roles in AI, cloud computing, and automation.

    ·      Module 1 - Introduction to Agentic AI
    ·      What is Agentic AI?
    ·      What are Agents? Agentic AI vs AI Agents, Agentic AI vs Generative AI, What are Multi-Agents?
    ·      Agentic AI Frameworks
    ·      Overview of Agentic AI Frameworks

    ·      Module 2 - Phidata: Agentic AI Framework
    ·      Core Concepts
    ·      Agents in Phidata, Models, Tools, Knowledge, Chunking
    ·      Data and Storage
    ·      Vector Databases (VectorDBs), Storage, Embeddings
    ·      Workflows
    ·      Workflow Design and Execution
    ·      Use Cases
    ·      Web Search Agents, Financial Agents, Retrieval-Augmented Generation (RAG) Agents

    ·      Module 3 - LangChain
    ·      Core Components and Data Handling
    ·      Introduction to Basic Components and Modules in LangChain, Data Ingestion with Document Loaders
    ·      Text Splitting Techniques
    ·      Recursive Character Text Splitter, Character Text Splitter, HTML Header Text Splitter, Recursive JSON Splitter
    ·      Embeddings and Vector Storage
    ·      OpenAI Embedding, Ollama Embedding, Hugging Face Embedding, VectorStores: FAISS and ChromaDB, VectorStore and Retriever
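The recursive text-splitting strategy covered in Module 3 can be illustrated with a small, self-contained sketch. This is a simplified imitation of the idea, not LangChain's actual implementation; in practice you would use `RecursiveCharacterTextSplitter` from the `langchain-text-splitters` package:

```python
def recursive_split(text, chunk_size=100, separators=("\n\n", "\n", " ", "")):
    """Split text into chunks of at most chunk_size characters, trying
    coarser separators (paragraphs) before finer ones (words, characters)."""
    if len(text) <= chunk_size:
        return [text]
    sep = separators[0]
    rest = separators[1:] if len(separators) > 1 else separators
    parts = text.split(sep) if sep else list(text)
    chunks, current = [], ""
    for part in parts:
        candidate = current + sep + part if current else part
        if len(candidate) <= chunk_size:
            current = candidate
        else:
            if current:
                chunks.append(current)
            # A single part may still be too long: recurse with finer separators.
            if len(part) > chunk_size:
                chunks.extend(recursive_split(part, chunk_size, rest))
                current = ""
            else:
                current = part
    if current:
        chunks.append(current)
    return chunks

doc = ("Paragraph one is short.\n\n"
       "Paragraph two is a bit longer and will need to be split "
       "into smaller pieces on word boundaries.")
chunks = recursive_split(doc, chunk_size=60)
```

Each chunk stays under the size limit, and short paragraphs survive intact, which is why this family of splitters is the usual default before computing embeddings.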

     

    ·      Module 4 - LCEL (LangChain Expression Language)
    ·      Getting Started
    ·      Open-Source Models Using the Groq API
    ·      Building and Deploying
    ·      Building LLM, Prompt, and Structured Output Chains with LCEL; Deploying LangServe Runnables and Chains as APIs
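LCEL's core idea is composing steps with the `|` operator. The toy class below imitates that composition in plain Python so the pattern is visible without API keys; it is not the real LangChain `Runnable` API, and `fake_llm` stands in for an actual model call:

```python
class Runnable:
    """Toy stand-in for LCEL's composable units: each step is invokable,
    and `a | b` builds a pipeline feeding a's output into b."""
    def __init__(self, func):
        self.func = func

    def invoke(self, value):
        return self.func(value)

    def __or__(self, other):
        return Runnable(lambda value: other.invoke(self.invoke(value)))

# A prompt template, a fake "model", and an output parser, chained LCEL-style:
prompt = Runnable(lambda topic: f"Tell me a fact about {topic}.")
fake_llm = Runnable(lambda text: {"content": f"ECHO: {text}"})
parser = Runnable(lambda msg: msg["content"])

chain = prompt | fake_llm | parser
result = chain.invoke("LangChain")  # "ECHO: Tell me a fact about LangChain."
```

In real LCEL code the same shape appears as `prompt | model | StrOutputParser()`, and LangServe can expose such a chain as an HTTP endpoint.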

     

    ·      Module 5 - LangServe for Efficient AI Deployment
    ·      Overview and Setup
    ·      Overview of LangServe and Its Capabilities, Importance of Efficient AI Model Serving, Key Features and Benefits of LangServe, Setting Up the LangServe Environment, Installing LangServe and Initial Configuration, Configuring Environment Variables and Dependencies
    ·      Model Deployment
    ·      API-Driven Model Serving: How LangServe Bridges AI Models and Applications, Deploying Your Model with LangServe, Creating and Managing Custom Endpoints, Integrations with External Tools

     

    ·      Module 6 - LangGraph
    ·      Core Concepts
    ·      Introduction, Simple Graph, LangGraph Studio, Chain, Router
    ·      Agents
    ·      Agent, Agent with Memory, Intro to Deployment
    ·      State Concepts
    ·      State Schema, State Reducers, Multiple Schemas
    ·      Message Handling
    ·      Trim and Filter Messages
    ·      Deployment Concepts
    ·      Creating and Connecting to a Deployment
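The graph, router, and state-reducer ideas in Module 6 can be sketched in plain Python. This is a hand-rolled illustration of the pattern, not LangGraph's `StateGraph` API; the node names and routing rule are invented for the example:

```python
# Nodes update a shared state dict, edges name the next node, and a
# reducer merges each node's output into state (lists are appended
# rather than overwritten, mimicking LangGraph's message reducers).
def reducer(state, update):
    merged = dict(state)
    for key, value in update.items():
        if isinstance(merged.get(key), list):
            merged[key] = merged[key] + value   # reducer: append, don't replace
        else:
            merged[key] = value
    return merged

def classify(state):
    # Router node: pick a branch based on the latest message.
    question = state["messages"][-1]
    return {"route": "math" if any(ch.isdigit() for ch in question) else "chat"}

def math_node(state):
    return {"messages": ["math agent handled it"]}

def chat_node(state):
    return {"messages": ["chat agent handled it"]}

nodes = {"classify": classify, "math": math_node, "chat": chat_node}
edges = {"classify": lambda s: s["route"],
         "math": lambda s: None,   # None means END
         "chat": lambda s: None}

def run_graph(state, entry="classify"):
    node = entry
    while node is not None:
        state = reducer(state, nodes[node](state))
        node = edges[node](state)
    return state

final = run_graph({"messages": ["what is 2 + 2?"]})
```

In LangGraph proper, the same structure is declared with `add_node`, `add_conditional_edges`, and a typed state schema, and a checkpointer adds the memory covered under "Agent with Memory".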

     

    ·      Module 7 - UX and Human-in-the-Loop with LangGraph
    ·      Interaction
    ·      Streaming, Breakpoints, Editing State and Human Feedback, Dynamic Breakpoints
    ·      Time Travel

     

    ·      Module 8 - Agentic RAG
    ·      Adaptive RAG
    ·      Adaptive RAG, Adaptive RAG with Cohere, Adaptive RAG Locally
    ·      RAG Variants
    ·      Agentic RAG, Corrective RAG (C-RAG), C-RAG Locally, Self-RAG, Self-RAG Locally, Self-RAG with a VectorDB
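At the heart of every RAG variant above is the same retrieve-then-grade loop. The stdlib-only sketch below uses a toy bag-of-words "embedding" purely for illustration; real pipelines use dense embeddings and a vector store such as FAISS or ChromaDB, and the `grade` step is where agentic variants like Corrective RAG decide to re-retrieve or reroute:

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words count vector (real systems use
    dense embedding models like those covered in Module 3)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = lambda v: math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm(a) * norm(b)) if a and b else 0.0

docs = [
    "LangGraph builds stateful agent workflows",
    "FAISS is a library for vector similarity search",
    "Streamlit turns Python scripts into web apps",
]

def retrieve(query, k=1):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def grade(query, doc):
    """Agentic-RAG-style relevance check: a failing grade would trigger
    a retry, query rewrite, or fallback to web search."""
    return cosine(embed(query), embed(doc)) > 0.0

best = retrieve("vector similarity search library")[0]
```

The retrieved context would then be packed into the prompt of a generation model, which is the step the agentic variants supervise.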

     

    ·      Module 9 - Designing Multi-Agent Systems with LangGraph
    ·      Agent Design
    ·      Building Agent Nodes in LangGraph, Agent Communication Protocols and Coordination, Defining Tasks and Roles for Agents
    ·      System Design
    ·      Creating Scalable Multi-Agent Systems in LangGraph, Building a Real-World Multi-Agent System

     

    ·      Module 10 - CrewAI Platform
    ·      Overview
    ·      Definition and Overview, Key Features and Capabilities, Crew Collaboration Framework
    ·      Collaboration and Tools
    ·      AI-Agent Communication, Workflow Automation in CrewAI, Customizing CrewAI, Managing Data Across Agents, Role-Playing, Memory, Tools, Focus, Guardrails, Cooperation, Using LangChain Tools

     

    ·      Module 11 - LangFlow Overview and Setup
    ·      Introduction and Setup
    ·      What is LangFlow? Overview and Use Cases, Key Features of LangFlow for LLM Applications, Setting Up Your LangFlow Environment
    ·      LangFlow UI and Terminologies
    ·      Understanding the LangFlow UI and Workflows, Key Terminologies in LangFlow (Nodes, Chains, Prompts)
    ·      Quick Start
    ·      Quick Start: Creating Your First LangFlow Application
    ·      Core Concepts
    ·      Nodes and Chains: Core Concepts, Understanding LLMs and Their Integration with LangFlow, Pre-built vs. Custom Workflows
    ·      LangChain and Prompt Engineering
    ·      Prompt Engineering Basics in LangFlow, LangChain Integration: Using LangFlow with LangChain
    ·      Commonly Used Nodes
    ·      Exploring Commonly Used LangFlow Nodes

     

    ·      Module 12 - Integration with Third-Party Tools
    ·      Data Integration
    ·      Connecting LangFlow with Data Sources (SQL, CSV, NoSQL), Using LangFlow with Vector Databases for Embeddings
    ·      API Integration
    ·      API Integration for External Services (REST, GraphQL), LangFlow with OpenAI and Hugging Face Models
    ·      Workflow Automation
    ·      Automating Workflows Using LangFlow, Building Chatbot Applications with LangFlow

     

    ·      Module 13 - Langfuse for LLM Observability
    ·      Langfuse Overview
    ·      What is Langfuse? Overview and Applications, Importance of Observability in LLMs, Key Features and Benefits of Langfuse, Understanding Langfuse's Integration Ecosystem
    ·      Integration and Monitoring
    ·      Step-by-Step Integration with Popular Frameworks (LangChain, OpenAI, etc.), Setting Up API Calls for Observability, Tracking Key Metrics: Response Times, Costs, and Errors, Monitoring Prompt Effectiveness and Token Usage

     

    ·      Module 14 - Metrics and Monitoring in LangWatch
    ·      LangWatch Overview
    ·      What is LangWatch? Overview and Use Cases, Key Features of LangWatch in Monitoring Language Models, Connecting LangWatch with LLMs
    ·      API Integration and Setup
    ·      API Integration: Sending Logs and Data to LangWatch, Setting Up Observability in AI Workflows
    ·      Using LangWatch with Frameworks
    ·      Using LangWatch with Popular Frameworks

     

    ·      Module 15 - LangSmith
    ·      LangSmith Overview
    ·      What is LangSmith? Overview and Key Features, LangSmith in the AI Development Workflow
    ·      Setup and Configuration
    ·      Setting Up LangSmith: Installation and Configuration, Exploring the User Interface and Core Functionalities
    ·      Workflow Management
    ·      Understanding Workflow Pipelines in LangSmith, Creating and Managing AI Workflows, Data Integration in LangSmith, Preprocessing and Cleaning Data, Managing Data Streams and Sources

     

    ·      Module 16 - Introduction to Autogen
    ·      Framework Overview
    ·      Overview, Key Concepts: Autonomy, Adaptability, and Inter-Agent Communication, Installation and Environment Setup
    ·      Agentic System Development
    ·      Introduction to Agents, Goals, Environments, and Actions, APIs, Libraries, and Tools Available Within the Autogen Framework, Designing and Developing Agentic Systems, Framework for Agentic Decision-Making
    ·      Agent Interaction and Learning
    ·      Interaction and Communication Between Agents, Implementing Feedback Loops, Handling Uncertainty and Constraints, Agent Learning and Adaptation, Multi-Agent Collaboration
    ·      Deployment and Monitoring
    ·      Deployment, Monitoring Agent Performance

     

    ·      Module 17 - End-to-End Agentic AI Projects
    ·      Project-Based Learning
    ·      Agentic AI Projects

     

    ·      Module 18 - AWS Cloud & Services for Generative AI
    ·      Introduction to AWS Cloud
    ·      Detailed Introduction to AWS Cloud Services, Creating an AWS Account, Creating an IAM User, Understanding Regions and Zones
    ·      AWS Compute and Container Services
    ·      AWS Elastic Container Registry (ECR), AWS Elastic Compute Cloud (EC2), AWS App Runner

     

    ·      Module 19 - AWS Bedrock
    ·      Introduction to AWS Bedrock
    ·      Amazon Bedrock - Introduction, Bedrock Console Walkthrough, Amazon Bedrock - Architecture
    ·      Bedrock Models and Use Cases
    ·      Bedrock Foundation Models, Bedrock Embeddings, Bedrock Chat Playgrounds
    ·      Bedrock Inference and Pricing
    ·      Amazon Bedrock - Inference Parameters, Bedrock Pricing

     

    ·      Module 20 - AWS SageMaker
    ·      Overview of AWS SageMaker
    ·      AWS SageMaker Overview, AWS SageMaker Walkthrough, AWS SageMaker Studio Overview, AWS SageMaker Studio Walkthrough
    ·      Model Deployment with SageMaker
    ·      Choose a Pre-trained Model, SageMaker Endpoint Creation, SageMaker Console Access, Create a SageMaker Domain, Open SageMaker Studio, SageMaker Model Deployment

     

    ·      Module 21 - AWS Lambda
    ·      Overview of AWS Lambda
    ·      Overview of AWS Lambda, Lambda Console Walkthrough, Lambda Permissions Model

     

    ·      Module 22 - AWS API Gateway
    ·      API Gateway Overview
    ·      AWS API Gateway, RESTful APIs, WebSocket APIs
    ·      Efficient API Development

     

    ·      Module 23 - Text Summarization with AWS Services
    ·      Integration of AWS Lambda with Bedrock and API Gateway
    ·      Creating an AWS Lambda Function and Upgrading Boto3, Writing the Lambda Function to Connect to the Bedrock Service, Creating a REST API Using AWS API Gateway with Lambda Integration
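A minimal sketch of the Lambda-to-Bedrock integration in this module might look like the following. The model ID and request/response shape follow Amazon Titan Text and are illustrative assumptions; check the Bedrock documentation for the model you actually deploy. The Bedrock client is passed in as a parameter so the handler can be exercised with a stub instead of `boto3.client("bedrock-runtime")`:

```python
import json

def lambda_handler(event, bedrock_client, model_id="amazon.titan-text-express-v1"):
    """Forward text from an API Gateway event to a Bedrock text model
    and return the summary as a Lambda proxy response."""
    payload = json.loads(event["body"])
    request = {"inputText": f"Summarize: {payload['text']}"}
    response = bedrock_client.invoke_model(
        modelId=model_id,
        body=json.dumps(request),
    )
    # Bedrock returns a streaming body; read and parse the JSON result.
    result = json.loads(response["body"].read())
    summary = result["results"][0]["outputText"]
    return {"statusCode": 200, "body": json.dumps({"summary": summary})}

# Stub client so the handler can run without an AWS account:
class _StubBody:
    def read(self):
        return json.dumps({"results": [{"outputText": "A short summary."}]})

class _StubBedrock:
    def invoke_model(self, modelId, body):
        return {"body": _StubBody()}

out = lambda_handler({"body": json.dumps({"text": "long article..."})},
                     _StubBedrock())
```

In the deployed version, API Gateway supplies the `event`, the real `boto3` client replaces the stub, and the Lambda execution role needs `bedrock:InvokeModel` permission.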

     

    ·      Module 24 - Fine-Tuning Foundation Models on Custom Data
    ·      Fine-Tuning Overview
    ·      Fine-Tuning of Foundation Models - Overview, Fine-Tuning of Foundation Models - Architecture
    ·      Hands-On with AWS SageMaker
    ·      Fine-Tuning of Foundation Models - Hands-On with AWS SageMaker

     

    ·      Module 25 - Project: AWS
    ·      Retrieval-Augmented Generation (RAG) in AWS
    ·      Overview, Setup, Data Transformation and Processing, LLM and Retrieval QA, Frontend and Backend Development
    ·      Building a Chatbot with Llama 3, LangChain & Streamlit
    ·      Overview, Setup, Data Handling and LLM Creation, Frontend and Final Demo

     

    ·      Module 26 - GCP Basics & Introduction to Vertex AI
    ·      Introduction to Google Cloud and Vertex AI
    ·      What is Vertex AI? Google AI Studio Introduction, Google Cloud Regions & Zones, Google Foundation Models
    ·      Vertex AI Setup and Installation
    ·      Vertex AI Installation, Google Cloud Setup for Production, Vertex AI Overview, Vertex AI Model Garden

     

    ·      Module 27 - Gemini Models with Vertex AI and Google AI Studio
    ·      Introduction to Google Gemini
    ·      What is Google Gemini? Playing with Gemini, Gemini 1.5 Pro (Preview Only), Gemini 1.0 Pro
    ·      Gemini Embeddings and Retrieval
    ·      Gemini Embeddings, Advanced Information Retrieval with Gemini
    ·      Working with Prompts
    ·      Working with Freeform & Structured Prompts, Working with Text Chat Prompts
    ·      Multimodal and Text-Based Use Cases
    ·      Generate Code, Unit Tests with the Code Chat Bison Model, Translate Text with the Translation LLM, Summarization, Classification
    ·      Multimodal Applications
    ·      Vision Model, Speech-to-Text & Text-to-Speech, Multimodal Prompts

     

    ·      Module 28 - Project: GCP
    ·      Retrieval-Augmented Generation (RAG) in GCP
    ·      Overview, Setup, Data Transformation and LLM Context, Frontend and Final Demo
    ·      Building a Chatbot with Gemini Pro, LangChain & Streamlit in GCP
    ·      Overview, Setup, Data Transformation and LLM Creation, Frontend and Final Demo

     


|| Frequently asked questions

The course aims to equip students with the skills to build intelligent agents (Agentic AI), create content-generating systems (Generative AI), and deploy scalable solutions using leading cloud platforms like AWS, Azure, and GCP.

This course is designed for BIT students who are interested in artificial intelligence, cloud technologies, automation, and real-world AI application development.

The course covers:

  • The fundamentals and frameworks of Agentic AI
  • The workings and applications of Generative AI models (e.g., LLMs)
  • Prompt engineering, fine-tuning, and API integration
  • How to use cloud platforms to deploy AI models and services
  • Building and deploying full-stack AI solutions in the cloud

Basic understanding of Python programming and foundational concepts in machine learning or AI is recommended. Familiarity with cloud platforms is a plus but not mandatory.

Tools and platforms used include:

  • Agentic AI tools and frameworks (e.g., LangChain, LangGraph, CrewAI)
  • Generative AI APIs (e.g., OpenAI, Hugging Face)
  • Cloud platforms: AWS, Microsoft Azure, Google Cloud Platform
  • Tools like Docker, Streamlit, and API testing utilities

Hands-on work includes:

  • Lab sessions with cloud integration
  • Mini-projects using GenAI and Agentic AI
  • A capstone project deploying a full AI pipeline on the cloud

Career paths include:

  • AI Developer / AI Engineer
  • Cloud AI Solutions Architect
  • Automation Specialist
  • Machine Learning Engineer
  • Innovation or R&D roles in AI product companies

Yes, students will receive a course completion certificate from BIT, and in some cases, they may also earn cloud platform badges or GenAI tool certifications depending on the tools used.