Streamlit in Google Colab

Are you stuck waiting for the right time, or will you make now the right time?
Gen AI Launch Pad 2025 is your answer.
What You’ll Learn:
In this blog, we will cover:
- How to set up Streamlit in Google Colab
- Building interactive web applications using Streamlit
- Creating an OpenAI-powered chatbot
- Developing a text summarization tool with LangChain
- Incorporating web search functionality into your applications
- Deploying your Streamlit apps using LocalTunnel
By the end of this guide, you’ll be able to build and deploy interactive web apps directly from Colab, leveraging the power of Python, Streamlit, and various advanced libraries.
1. Getting Started: Setting Up Streamlit in Google Colab
Streamlit’s intuitive design allows developers to focus on functionality without worrying about front-end development. To get started, you first need to prepare your Colab environment by installing the necessary libraries.
Installation Steps:
Run the following command in your Colab notebook to install Streamlit and other essential packages:
!pip install -U langchain-community tiktoken duckduckgo-search streamlit
Explanation of Installed Libraries:
- Streamlit: The backbone of our web application, enabling us to create interactive UIs.
- LangChain-Community: A library designed to simplify the integration of large language models and AI workflows.
- Tiktoken: Optimized tokenizer for handling input text for OpenAI models.
- DuckDuckGo Search: Facilitates web search functionality directly from the app.
Once these packages are installed, you’re ready to start coding your apps.
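Before building anything larger, you can sanity-check the setup with a minimal app. The sketch below (a hypothetical hello.py, written with the same %%writefile cell magic used throughout this guide) simply renders a title and echoes whatever the user types:

%%writefile hello.py
import streamlit as st

# Smallest possible check that Streamlit is installed and working:
# a title, a text input, and an echoed response.
st.title("Hello from Colab!")
name = st.text_input("What's your name?")
if name:
    st.write(f"Nice to meet you, {name}!")

To view it, use the same LocalTunnel command shown in Section 4, substituting hello.py for app.py.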
2. Building an OpenAI-Powered Chatbot
Overview:
One of the most exciting applications of Streamlit is building chatbots. By integrating OpenAI’s GPT models, you can create conversational apps that answer questions, provide recommendations, or even assist with creative tasks.
Code for the Chatbot App:
Here’s the complete code to build a chatbot using Streamlit and OpenAI:
%%writefile app.py
from openai import OpenAI
import streamlit as st

# Sidebar for API Key Input
with st.sidebar:
    openai_api_key = st.text_input("OpenAI API Key", key="chatbot_api_key", type="password")
    # Streamlit "magic": a bare string literal is rendered as Markdown
    "[Get an OpenAI API key](https://platform.openai.com/account/api-keys)"

# Title and Caption
st.title("💬 Chatbot")
st.caption("🚀 A Streamlit chatbot powered by OpenAI")

# Initialize Session State
if "messages" not in st.session_state:
    st.session_state["messages"] = [{"role": "assistant", "content": "How can I help you?"}]

# Display Chat Messages
for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

# Handle User Input
if prompt := st.chat_input():
    if not openai_api_key:
        st.info("Please add your OpenAI API key to continue.")
        st.stop()
    client = OpenAI(api_key=openai_api_key)
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").write(prompt)
    response = client.chat.completions.create(model="gpt-3.5-turbo", messages=st.session_state.messages)
    msg = response.choices[0].message.content
    st.session_state.messages.append({"role": "assistant", "content": msg})
    st.chat_message("assistant").write(msg)
Key Features Explained:
- Sidebar Input: Users can securely input their OpenAI API key.
- Session State: Maintains a conversation history for dynamic interactions (see the sketch after this list).
- Chat Interface: Displays user queries and AI responses in a structured format.
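To see what st.session_state does in isolation, here is a tiny standalone sketch (separate from the chatbot code): Streamlit reruns the whole script on every interaction, and only values stored in session state survive those reruns, which is what keeps the chat history alive.

import streamlit as st

# Streamlit reruns this script top to bottom on every interaction.
# Plain variables are reset each time; st.session_state persists.
if "counter" not in st.session_state:
    st.session_state.counter = 0

if st.button("Click me"):
    st.session_state.counter += 1

st.write(f"Button clicked {st.session_state.counter} times")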
Expected Output:
After running the app, users will see:
- A sidebar to enter their OpenAI API key.
- A chat interface where they can input queries and receive AI-generated responses.
Real-World Applications:
- Customer support bots
- Interactive teaching assistants
- Personal productivity tools
3. Fetching Public IP Address
Before deploying your app, it's worth grabbing your Colab environment's public IP address. It helps with debugging and access configuration, and LocalTunnel's landing page asks first-time visitors for a "tunnel password", which is exactly this IP.
Command:
!wget -q -O - ipv4.icanhazip.com
Explanation:
This command fetches your public IP address from the icanhazip.com service.
Expected Output:
A single line displaying your public IP address, e.g., 35.234.33.244.
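If you prefer to fetch the address from Python rather than a shell command, the requests library (preinstalled in Colab) can query the same service; a minimal sketch:

import requests

# Same service as the wget command above, called from Python
public_ip = requests.get("https://ipv4.icanhazip.com", timeout=10).text.strip()
print(public_ip)

Keep this value handy; it doubles as the tunnel password LocalTunnel asks for in the next step.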
4. Deploying Apps with LocalTunnel
To make your Streamlit apps accessible on the web, use LocalTunnel. This tool creates a secure tunnel from your Colab environment to the internet.
Command:
!streamlit run app.py & npx localtunnel --port 8501
Detailed Steps:
- Run the Streamlit Server: Starts the app locally on port 8501.
- LocalTunnel Command: Exposes the local server to a public URL.
Expected Output:
LocalTunnel will generate a unique public URL that redirects to your app, e.g., https://example.loca.lt.
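The one-liner above keeps the Streamlit server attached to the cell's output. A common variation (a sketch; the log-file path is an arbitrary choice) sends the server output to a file so only the tunnel URL appears in the cell:

# Start the Streamlit server in the background, logging to a file,
# then expose port 8501 through LocalTunnel
!streamlit run app.py --server.port 8501 &>/content/streamlit_logs.txt &
!npx localtunnel --port 8501

When the loca.lt landing page asks for a tunnel password, paste the public IP you retrieved in Section 3.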
5. LangChain-Powered Text Summarization App
Overview:
This app allows users to input large blocks of text and receive concise summaries, leveraging LangChain’s advanced text processing capabilities.
Code for Summarization App:
%%writefile webapp.py
import streamlit as st
from langchain import OpenAI
from langchain.docstore.document import Document
from langchain.text_splitter import CharacterTextSplitter
from langchain.chains.summarize import load_summarize_chain

# Function to Generate Summaries
def generate_response(txt):
    llm = OpenAI(temperature=0, openai_api_key=openai_api_key)
    text_splitter = CharacterTextSplitter()
    texts = text_splitter.split_text(txt)
    docs = [Document(page_content=t) for t in texts]
    chain = load_summarize_chain(llm, chain_type='map_reduce')
    return chain.run(docs)

# App Configuration
st.set_page_config(page_title='🦔🔗 Text Summarization App')
st.title('🦔🔗 Text Summarization App')

# Text Input
txt_input = st.text_area('Enter your text', '', height=200)

result = []
with st.form('summarize_form', clear_on_submit=True):
    openai_api_key = st.text_input('OpenAI API Key', type='password', disabled=not txt_input)
    submitted = st.form_submit_button('Submit')
    if submitted and openai_api_key.startswith('sk-'):
        with st.spinner('Calculating...'):
            response = generate_response(txt_input)
            result.append(response)

if len(result):
    st.info(response)
Key Features:
- Input Area: Users can paste text for summarization.
- LangChain Integration: Uses LangChain’s efficient summarization chains.
- Dynamic Results: Displays summaries in real time.
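The web-search piece mentioned at the start follows the same pattern: the duckduckgo-search package installed in Section 1 can back a small search page. Below is a minimal sketch (assuming the DDGS text-search interface from duckduckgo-search; the file name search_app.py is arbitrary):

%%writefile search_app.py
import streamlit as st
from duckduckgo_search import DDGS

st.title("🔎 Web Search")
query = st.text_input("Search the web")

if query:
    # Fetch a few results and render title, link, and snippet for each
    with DDGS() as ddgs:
        results = ddgs.text(query, max_results=5)
    for r in results:
        st.markdown(f"**[{r['title']}]({r['href']})**")
        st.write(r["body"])

It deploys exactly like app.py and webapp.py: run it with Streamlit and expose port 8501 through LocalTunnel.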
Conclusion:
Streamlit in Colab enables developers to quickly prototype and deploy interactive web applications. Whether building a chatbot, summarization tool, or search-enabled app, the possibilities are endless. By combining the power of Streamlit, LangChain, and OpenAI, you can create scalable, impactful solutions in record time.
Resources:
- Streamlit Documentation
- OpenAI API Reference
- LangChain Documentation
- LocalTunnel
- Colab Documentation
- Streamlit + Colab Experiment Notebook
---------------------------
Stay Updated: Follow Build Fast with AI pages for all the latest AI updates and resources.
Experts predict 2025 will be the defining year for Gen AI implementation. Want to be ahead of the curve?
Join Build Fast with AI’s Gen AI Launch Pad 2025 - your accelerated path to mastering AI tools and building revolutionary applications.
---------------------------
Resources and Community
Join our community of 12,000+ AI enthusiasts and learn to build powerful AI applications! Whether you're a beginner or an experienced developer, this tutorial will help you understand and implement AI agents in your projects.
- Website: www.buildfastwithai.com
- LinkedIn: linkedin.com/company/build-fast-with-ai/
- Instagram: instagram.com/buildfastwithai/
- Twitter: x.com/satvikps
- Telegram: t.me/BuildFastWithAI