Introduction

Modern retail stores need more than basic inventory tracking. They need intelligent assistants that can answer questions about products, stock levels, and sales trends. In this tutorial, I'll walk you through building a conversational store assistant bot that leverages Google Cloud tools such as Firestore, BigQuery, and Vertex AI to create a natural language interface for your store's inventory data.

What you'll learn:

  • Setting up a BigQuery dataset for product history tracking
  • [Optional] Migrating store data from Firestore to BigQuery (Skip if your data is already in BigQuery or another database)
  • Creating REST API endpoints for product analytics
  • Building an AI-powered store assistant using Vertex AI's Gemini model

Prerequisites

Before we begin, make sure you have:

  • A Google Cloud Platform account
  • A Firebase project set up with Firestore (required for Firestore and Cloud Functions)
  • Node.js and npm installed
  • Basic understanding of Express.js
  • Firebase CLI installed (npm install -g firebase-tools)

Setting Up The Environment

First, you'll need to set up the necessary permissions and service accounts:

  1. Create a service account in the GCP console with the following roles:

    • BigQuery Admin
    • Firebase/Firestore Admin
    • Vertex AI User
  2. Download the service account key file and store it securely

  3. Install the required npm packages for our project (an install command follows this list):

    • firebase-admin
    • @google-cloud/bigquery
    • express
    • cors
    • firebase-functions
    • axios
    • @google-cloud/vertexai
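
You can install them all at once from inside your functions directory:

npm install firebase-admin @google-cloud/bigquery express cors firebase-functions axios @google-cloud/vertexai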

Step 1: Creating a BigQuery Table for Product History Data

Let's start by creating a BigQuery table to store product transaction history. Run the following script:

const serviceAccountPath = "./path-to-your-service-account.json";
const { BigQuery } = require('@google-cloud/bigquery');

// Initialize services
const bigquery = new BigQuery({
  keyFilename: serviceAccountPath,
  projectId: "your-project-id"
});

async function createTable() {
  const datasetId = 'store_assistant';
  const tableId = 'product_history';

  // Define schema based on your store's product data structure
  const schema = [
    { name: 'transaction_id', type: 'STRING', mode: 'REQUIRED' },
    { name: 'product_id', type: 'STRING' },
    { name: 'category_id', type: 'STRING' },
    { name: 'sku', type: 'STRING' },
    { name: 'quantity', type: 'NUMERIC' },
    { name: 'unit_count', type: 'NUMERIC' },
    { name: 'current_stock', type: 'NUMERIC' },
    { name: 'unit_price', type: 'NUMERIC' },
    { name: 'product_name', type: 'STRING' },
    { name: 'product_code', type: 'STRING' },
    { name: 'attributes', type: 'STRING' },
    { name: 'status', type: 'STRING' },
    { name: 'category', type: 'STRING' },
    { name: 'tags', type: 'STRING' },
    { name: 'notes', type: 'STRING' },

    // Timestamp
    { name: 'transaction_date', type: 'TIMESTAMP' }
  ];

  const options = {
    schema,
    location: 'US' // Optional: defaults to the dataset's location
  };

  try {
    // Create the dataset if it doesn't exist
    await bigquery.createDataset(datasetId, {
      location: 'US'
    }).catch(err => {
      if (err.code !== 409) throw err; // 409 means dataset already exists
    });

    // Create the table
    const [table] = await bigquery.dataset(datasetId).createTable(tableId, options);
    console.log(`✅ Table ${table.id} created successfully.`);
  } catch (error) {
    console.error('Error creating table:', error);
  }
}

createTable().catch(console.error);
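
To confirm the table was created, a quick existence check with the same client looks like this (a minimal sketch; it reuses the bigquery instance and the dataset and table IDs from the script above):

async function verifyTable() {
  const [exists] = await bigquery
    .dataset('store_assistant')
    .table('product_history')
    .exists();
  console.log(exists ? '✅ Table exists' : '❌ Table not found');
}

verifyTable().catch(console.error);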

Step 2: [Optional] Migrating Data from Firestore to BigQuery

Now, let's migrate your existing store data from Firestore to BigQuery for advanced analytics:

const admin = require('firebase-admin');
const { BigQuery } = require('@google-cloud/bigquery');

// Initialize Firebase
admin.initializeApp({
  credential: admin.credential.cert('./path-to-your-service-account.json') // You can also initialize with application default credentials
});

const db = admin.firestore();

// Initialize BigQuery
const bigquery = new BigQuery({
  keyFilename: './path-to-your-service-account.json'
});

const datasetId = 'store_assistant';
const tableId = 'product_history';

// Define the field structure to match your requirements
// ======== NOTE ========
// This step is not required if your data already has the required structure
const nestedFields = [
  "quantity", "unit_count", "current_stock", "unit_price",
  "product_name", "product_code", "attributes", "status",
  "category", "tags", "notes", "transaction_date"
];
const directFields = ["product_id", "category_id", "sku"];

// Helper to parse numbers safely
function parseNumeric(value) {
  const num = parseFloat(value);
  return isNaN(num) ? null : num;
}

async function processDocs() {
  let lastDoc = null;
  const batchSize = 1000; // Adjust based on your document size
  let processedCount = 0;
  let buffer = [];

  console.log('🚀 Starting migration from Firestore to BigQuery...');

  while (true) {
    // Build query with pagination
    let query = db.collection('product_history').orderBy('product_id').limit(batchSize);
    if (lastDoc) query = query.startAfter(lastDoc);

    const snapshot = await query.get();
    if (snapshot.empty) break;

    // Process each document
    snapshot.docs.forEach(doc => {
      const data = doc.data();
      const result = { transaction_id: doc.id };

      // Process nested fields (fields with values array)
      nestedFields.forEach(field => {
        const fieldData = data[field];
        let value = (fieldData && Array.isArray(fieldData.values) && fieldData.values.length > 0)
          ? fieldData.values[0]
          : "";

        // Convert values based on field type
        switch (field) {
          case 'quantity':
          case 'unit_count':
          case 'current_stock':
          case 'unit_price':
            value = parseNumeric(value); // parseNumeric already returns null for invalid values
            break;
          case 'transaction_date':
            if (typeof value === 'string') {
              const date = new Date(value);
              value = isNaN(date.getTime()) ? null : date;
            } else if (value && value._seconds) {
              // Handle Firestore timestamps
              value = new Date(value._seconds * 1000);
            } else {
              value = null;
            }
            break;
          default:
            value = typeof value === 'string' ? value : String(value ?? '');
        }

        result[field] = value;
      });

      // Process direct fields
      directFields.forEach(field => {
        result[field] = data[field] || "";
      });

      buffer.push(result);
      processedCount++;
    });

    // Insert batch to BigQuery
    if (buffer.length > 0) {
      await bigquery.dataset(datasetId).table(tableId).insert(buffer, {
        ignoreUnknownValues: true,
        skipInvalidRows: true
      });
      console.log(`✅ Inserted ${buffer.length} rows into BigQuery`);
      buffer = [];
    }

    lastDoc = snapshot.docs[snapshot.docs.length - 1];
    console.log(`📊 Processed ${processedCount} documents...`);
  }

  console.log(`\n🎉 Migration complete! Migrated ${processedCount} records to BigQuery.`);
}

processDocs().catch(err => {
  console.error('❌ Error during migration:', err);
});

Note: If your data already matches the BigQuery schema, skip the formatting logic above and insert the documents directly, as sketched below.
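
Here's a minimal sketch of that direct path, assuming each Firestore document already contains flat fields matching the BigQuery schema (for very large collections, keep the batched pagination from the script above):

async function migrateDirect() {
  const snapshot = await db.collection('product_history').get();

  // Map each document straight to a BigQuery row, using the document ID as transaction_id
  const rows = snapshot.docs.map(doc => ({
    transaction_id: doc.id,
    ...doc.data()
  }));

  if (rows.length > 0) {
    await bigquery.dataset(datasetId).table(tableId).insert(rows, {
      ignoreUnknownValues: true,
      skipInvalidRows: true
    });
    console.log(`✅ Inserted ${rows.length} rows directly into BigQuery`);
  }
}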

Step 3: Creating API Endpoints for Store Assistant Bot

Now let's create API endpoints that will provide various product analytics for the store assistant bot. We'll use Cloud Functions for Firebase to host our storeAPI and storeBot:

Create a project directory and initialize Cloud Functions for Firebase in it (see the Firebase documentation for the full setup).
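
The usual initialization command, run from your project root, is:

firebase init functions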

  • Create storeAPI.js inside the functions directory, build an Express API app in it, and export the app.

Note: The exact API endpoints will depend on your requirements.

const cors = require("cors");
const { BigQuery } = require("@google-cloud/bigquery");
const express = require("express");
const { onRequest } = require("firebase-functions/v2/https");
const { logger } = require("firebase-functions");
const app = express();

// Middleware
app.use(cors({ origin: true }));
app.use(express.json());

// Initialize BigQuery
const bigquery = new BigQuery();

const datasetId = "store_assistant";
const tableId = "product_history";

// Test endpoint
app.get("/test", async (req, res) => {
  logger.info("This is test end point");
  res.status(200).json({ status: "ok", message: "Store Assistant API is working" });
});


// 1. Get product history for a specific product
app.get(
  "/product-history",
  async (req, res) => {
    const productCode = req.query.productCode;
    logger.info("product-history: Received request for productCode:", productCode);

    if (!productCode) {
      logger.warn("product-history: Missing productCode parameter");
      return res.status(400).json({
        error: "Missing information",
        message: "Please provide a product code to check history"
      });
    }

    const query = `
    SELECT *
    FROM \`${datasetId}.${tableId}\`
    WHERE product_code = @productCode
    ORDER BY transaction_date DESC
    `;

    const options = {
      query,
      params: { productCode },
    };

    try {
      const [rows] = await bigquery.query(options);
      logger.info(`product-history: Query completed, found ${rows.length} records`);

      if (rows.length === 0) {
        return res.status(404).json({
          success: false,
          message: `No history found for product with code ${productCode}`,
          data: {
            productCode: productCode,
            historyCount: 0
          }
        });
      }

      res.status(200).json({
        success: true,
        message: `Product ${rows[0].product_name} has ${rows.length} recorded events`,
        data: {
          productCode: productCode,
          productName: rows[0].product_name,
          code: productCode,
          historyCount: rows.length,
          history: rows.map(row => ({
            date: row.transaction_date,
            quantity: row.quantity,
            price: row.unit_price,
            type: row.transaction_type || "Unknown" // add a transaction_type column to the schema if you track event types
          }))
        }
      });
    } catch (error) {
      logger.error("product-history: BigQuery error:", error);
      res.status(500).json({
        success: false,
        message: "Error retrieving product history data",
        error: error.message
      });
    }
  }
);

// 2. Get most popular product with customers
app.get(
  "/most-popular-product",
  async (req, res) => {
    logger.info("most-popular-product: Executing query");

    const query = `
    SELECT product_id, sku, product_name, product_code, COUNT(*) as transaction_count
    FROM \`${datasetId}.${tableId}\`
    GROUP BY product_id, sku, product_name, product_code
    ORDER BY transaction_count DESC
    LIMIT 1
    `;

    try {
      const [rows] = await bigquery.query(query);

      if (rows.length === 0) {
        return res.json({
          success: false,
          message: "No product history records found in the system"
        });
      }

      res.status(200).json({
        success: true,
        message: `${rows[0].product_name} is the most popular product with ${rows[0].transaction_count} customer interactions`,
        data: {
          productId: rows[0].product_id,
          sku: rows[0].sku,
          productName: rows[0].product_name,
          productCode: rows[0].product_code,
          transactionCount: rows[0].transaction_count
        }
      });
    } catch (error) {
      logger.error("most-popular-product: BigQuery error:", error);
      res.status(500).json({
        success: false,
        message: "Error retrieving product popularity data",
        error: error.message
      });
    }
  }
);

// 3. Get this month's best seller
app.get(
  "/monthly-best-seller",
  async (req, res) => {
    logger.info("monthly-best-seller: Executing query for current month");

    const query = `
    SELECT product_id, sku, product_name, COUNT(*) as transaction_count
    FROM \`${datasetId}.${tableId}\`
    WHERE EXTRACT(MONTH FROM transaction_date) = EXTRACT(MONTH FROM CURRENT_DATE())
    AND EXTRACT(YEAR FROM transaction_date) = EXTRACT(YEAR FROM CURRENT_DATE())
    GROUP BY product_id, sku, product_name
    ORDER BY transaction_count DESC
    LIMIT 1
    `;

    try {
      const [rows] = await bigquery.query(query);

      if (rows.length === 0) {
        return res.json({
          success: false,
          message: "No sales recorded for the current month"
        });
      }

      // Get current month name
      const monthNames = [
        "January", "February", "March", "April", "May", "June",
        "July", "August", "September", "October", "November", "December"
      ];
      const currentMonth = monthNames[new Date().getMonth()];

      res.status(200).json({
        success: true,
        message: `${rows[0].product_name} is our best seller this month (${currentMonth}) with ${rows[0].transaction_count} sales`,
        data: {
          month: currentMonth,
          productId: rows[0].product_id,
          sku: rows[0].sku,
          productName: rows[0].product_name,
          transactionCount: rows[0].transaction_count
        }
      });
    } catch (error) {
      logger.error("monthly-best-seller: BigQuery error:", error);
      res.status(500).json({
        success: false,
        message: "Error retrieving monthly sales data",
        error: error.message
      });
    }
  }
);

// 4. Get store inventory value and top products
app.get(
  "/store-inventory",
  async (req, res) => {
    logger.info("store-inventory: Executing inventory value query");

    const query = `
    WITH current_inventory AS (
      SELECT product_id, product_name, sku, product_code,
        SAFE_CAST(current_stock AS FLOAT64) as stock_qty,
        SAFE_CAST(unit_price AS FLOAT64) as item_price,
        (SAFE_CAST(current_stock AS FLOAT64) * SAFE_CAST(unit_price AS FLOAT64)) as total_value
      FROM \`${datasetId}.${tableId}\`
      WHERE SAFE_CAST(current_stock AS FLOAT64) > 0
        AND SAFE_CAST(unit_price AS FLOAT64) > 0
    )

    SELECT 
      (SELECT SUM(total_value) FROM current_inventory) as total_inventory_value,
      ARRAY(
        SELECT AS STRUCT product_id, product_name, sku, product_code, stock_qty, item_price, total_value
        FROM current_inventory
        ORDER BY total_value DESC
        LIMIT 5
      ) as top_valuable_items
    `;

    try {
      const [rows] = await bigquery.query(query);

      if (rows.length === 0) {
        return res.json({
          success: false,
          message: "No inventory data found in the system"
        });
      }

      const formatter = new Intl.NumberFormat('en-US', {
        style: 'currency',
        currency: 'USD'
      });

      const totalValue = formatter.format(rows[0].total_inventory_value);

      const topItems = rows[0].top_valuable_items.map(item => ({
        productId: item.product_id,
        sku: item.sku,
        name: item.product_name,
        code: item.product_code,
        quantity: item.stock_qty,
        unitPrice: formatter.format(item.item_price),
        totalValue: formatter.format(item.total_value)
      }));

      res.status(200).json({
        success: true,
        message: `Our store's current inventory is valued at ${totalValue}`,
        data: {
          totalInventoryValue: totalValue,
          topItems: topItems
        }
      });
    } catch (error) {
      logger.error("store-inventory: BigQuery error:", error);
      res.status(500).json({
        success: false,
        message: "Error retrieving store inventory data",
        error: error.message
      });
    }
  }
);

// Additional endpoints to implement:
// - '/trending-products' - Products with increasing customer interest
// - '/product-recommendations' - AI-based product recommendations
// - '/low-stock-alerts' - Identify products that need replenishment
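
// As one example, here is a minimal sketch of the low-stock endpoint.
// The threshold of 10 units and the "latest record per product" logic are
// assumptions; adjust them to how your store tracks stock.
app.get("/low-stock-alerts", async (req, res) => {
  logger.info("low-stock-alerts: Executing low stock query");

  const query = `
  SELECT product_id, product_name, sku, product_code, stock_qty
  FROM (
    SELECT product_id, product_name, sku, product_code,
      SAFE_CAST(current_stock AS FLOAT64) as stock_qty,
      ROW_NUMBER() OVER (PARTITION BY product_id ORDER BY transaction_date DESC) as rn
    FROM \`${datasetId}.${tableId}\`
  ) AS latest
  WHERE rn = 1 AND stock_qty IS NOT NULL AND stock_qty <= 10
  ORDER BY stock_qty ASC
  LIMIT 20
  `;

  try {
    const [rows] = await bigquery.query(query);
    res.status(200).json({
      success: true,
      message: `${rows.length} products are running low on stock`,
      data: { lowStockItems: rows }
    });
  } catch (error) {
    logger.error("low-stock-alerts: BigQuery error:", error);
    res.status(500).json({
      success: false,
      message: "Error retrieving low stock data",
      error: error.message
    });
  }
});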

// Error handling middleware
app.use((err, req, res, next) => {
  logger.error("Global error handler:", err);
  res
    .status(500)
    .json({
      success: false,
      message: "Something went wrong with your request",
      error: err.message
    });
});

// Export the Express app as storeAPI
module.exports = { storeAPI: app };

Step 4: Integrating AI with Vertex AI for the Store Assistant Bot

Now for the exciting part! Let's add the AI layer using Google's Vertex AI to create a conversational store assistant bot that can answer customer questions.

Create another file, storeBot.js, in the functions folder, build an Express app in it, and export it at the end.

const express = require('express');
const axios = require('axios');
const { VertexAI } = require('@google-cloud/vertexai');
const app = express();
app.use(express.json());

// Configuration
const API_BASE_URL = 'https://your-region-your-project.cloudfunctions.net/api';
const PROJECT_ID = "your-project-id";
const LOCATION = 'us-central1';
const MODEL_ID = 'gemini-1.5-pro'; // Or the latest available model

// Create VertexAI instance
const vertexAI = new VertexAI({project: PROJECT_ID, location: LOCATION});
const generativeModel = vertexAI.getGenerativeModel({
  model: MODEL_ID,
  generationConfig: {
    temperature: 0.2, 
    maxOutputTokens: 1024,
  }
});

// Define the API endpoints map
const API_ENDPOINTS = {
  productHistory: `${API_BASE_URL}/product-history`,
  mostPopularProduct: `${API_BASE_URL}/most-popular-product`,
  monthlyBestSeller: `${API_BASE_URL}/monthly-best-seller`,
  storeInventory: `${API_BASE_URL}/store-inventory`,
  trendingProducts: `${API_BASE_URL}/trending-products`,
  productRecommendations: `${API_BASE_URL}/product-recommendations`,
  lowStockAlerts: `${API_BASE_URL}/low-stock-alerts`
};

// Helper function to make API calls
async function callAPI(endpoint, params = {}) {
  try {
    let url = API_ENDPOINTS[endpoint];
    if (!url) {
      throw new Error(`Unknown endpoint: ${endpoint}`);
    }

    // Add query parameters if provided
    if (Object.keys(params).length > 0) {
      const queryParams = new URLSearchParams();
      for (const [key, value] of Object.entries(params)) {
        queryParams.append(key, value);
      }
      url = `${url}?${queryParams.toString()}`;
    }

    const response = await axios.get(url);
    return response.data;
  } catch (error) {
    console.error(`Error calling ${endpoint}:`, error.message);
    if (error.response) {
      return error.response.data;
    }
    return { success: false, message: error.message };
  }
}

// Function to analyze customer query and determine API endpoint
async function analyzeQuery(query) {
  try {
    const prompt = `
    You are a helpful store assistant bot. Your task is to analyze the customer's query and determine which API endpoint should be called, along with any required parameters.

    Available endpoints:
    1. productHistory - Requires productCode parameter (for queries about history or information for a specific product)
    2. mostPopularProduct - No parameters needed (for queries about which product is most popular overall)
    3. monthlyBestSeller - No parameters needed (for queries about best sellers this month)
    4. storeInventory - No parameters needed (for queries about store inventory value or available products)
    5. trendingProducts - Optional 'days' parameter (default 30) for queries about currently trending products
    6. productRecommendations - Requires productId parameter (for queries about recommended related products)
    7. lowStockAlerts - No parameters needed (for queries about products that need to be restocked)

    Customer query: "${query}"

    Respond with a valid JSON object containing:
    1. "endpoint": The API endpoint that should be called
    2. "params": An object containing any parameters needed for the API call
    3. "explanation": Brief explanation of why this endpoint was selected
    `;

    try {
      const result = await generativeModel.generateContent({
        contents: [{ role: "user", parts: [{ text: prompt }] }]
      });

      const responseText = result.response.candidates[0].content.parts[0].text;

      // Extract the JSON object from the response (the model may wrap it in Markdown code fences)
      const jsonMatch = responseText.match(/```json\n([\s\S]*?)\n```/) ||
                        responseText.match(/```\n([\s\S]*?)\n```/) ||
                        responseText.match(/{[\s\S]*?}/);

      let jsonStr = jsonMatch ? jsonMatch[0] : responseText;

      // Strip any remaining code fences so the string parses as valid JSON
      jsonStr = jsonStr.replace(/```json\n|```\n|```/g, '').trim();

      return JSON.parse(jsonStr);
    } catch (error) {
      console.error('Vertex AI error:', error);
      // Fall back to basic intent matching
      return fallbackQueryAnalysis(query);
    }
  } catch (error) {
    console.error('Error analyzing query:', error);
    return {
      endpoint: null,
      params: {},
      explanation: `Error analyzing query: ${error.message}`
    };
  }
}
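
// fallbackQueryAnalysis is called above; here is a minimal keyword-based
// sketch of it (extend the matching to suit your catalog)
function fallbackQueryAnalysis(query) {
  const q = query.toLowerCase();

  if (q.includes('best seller') || q.includes('this month')) {
    return { endpoint: 'monthlyBestSeller', params: {}, explanation: 'Keyword match: monthly best seller' };
  }
  if (q.includes('popular')) {
    return { endpoint: 'mostPopularProduct', params: {}, explanation: 'Keyword match: most popular product' };
  }
  if (q.includes('inventory') || q.includes('in stock')) {
    return { endpoint: 'storeInventory', params: {}, explanation: 'Keyword match: store inventory' };
  }
  if (q.includes('restock') || q.includes('low stock')) {
    return { endpoint: 'lowStockAlerts', params: {}, explanation: 'Keyword match: low stock alerts' };
  }

  // Nothing matched; the caller will ask the customer to be more specific
  return { endpoint: null, params: {}, explanation: 'No keyword match found' };
}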

// Function to format API response into natural language for the store assistant bot
async function formatResponse(apiResponse, userQuery, endpoint) {
  try {
    const prompt = `
    You are a helpful store assistant bot. Your task is to format the API response into a natural, conversational response that addresses the customer's query.

    Customer query: "${userQuery}"

    API endpoint used: ${endpoint}

    API response: ${JSON.stringify(apiResponse)}

    Format your response in a natural, conversational way that:
    1. Directly answers the customer's question
    2. Highlights the most important information from the API response
    3. Provides context where helpful
    4. Is concise but complete
    5. Sounds like a helpful store assistant, not like a computer

    Your response should be friendly and helpful, like a knowledgeable store employee.
    `;

    const result = await generativeModel.generateContent({
      contents: [{ role: "user", parts: [{ text: prompt }] }]
    });

    return result.response.candidates[0].content.parts[0].text;
  } catch (error) {
    console.error('Error formatting response:', error);
    return `I found some information about that, but I'm having trouble putting it into words. Here's what I know: ${JSON.stringify(apiResponse.data || apiResponse)}`;
  }
}

// Main endpoint to process customer queries
app.post('/query', async (req, res) => {
  try {
    const { query } = req.body;

    if (!query) {
      return res.status(400).json({ 
        success: false, 
        message: 'Please provide a question about our products or inventory' 
      });
    }

    // Step 1: Analyze the query to determine the API endpoint
    const analysis = await analyzeQuery(query);

    // Step 2: If we can't determine the endpoint, ask for more information
    if (!analysis.endpoint) {
      return res.json({
        success: true,
        response: "I'm not sure what product information you're looking for. Could you be more specific? For example, you can ask about a specific product, our best sellers, or what's in stock."
      });
    }

    // Step 3: Call the appropriate API endpoint
    const apiResponse = await callAPI(analysis.endpoint, analysis.params);

    // Step 4: Format the response into natural language
    const formattedResponse = await formatResponse(apiResponse, query, analysis.endpoint);

    // Step 5: Return the formatted response
    res.json({
      success: true,
      response: formattedResponse,
      raw: {
        analysis: analysis,
        apiResponse: apiResponse
      }
    });

  } catch (error) {
    console.error('Error processing query:', error);
    res.status(500).json({ 
      success: false, 
      message: "I'm having trouble understanding your question right now. Could you try asking it differently?",
      error: error.message
    });
  }
});

// Export the bot Express app as storeBot
module.exports = { storeBot: app };

Step 5: Deploying Google cloud functions

Now, inside the index.js file in your functions directory, import the storeAPI and storeBot apps and wrap them with onRequest:

const admin = require("firebase-admin");
const { onRequest } = require("firebase-functions/v2/https");
// Initialize Firebase App
admin.initializeApp();

const { storeAPI } = require('./storeAPI.js');
const { storeBot } = require('./storeBot.js');

// Wrap both Express apps with onRequest
exports.api = onRequest(storeAPI);
exports.bot = onRequest(storeBot);

Now deploy to Firebase with the CLI:

firebase deploy --only functions:api,functions:bot

Step 6: Testing and Using the Store Assistant Bot

Once deployed, you can integrate the bot into your store through:

  • Direct API calls to endpoints for specific product information
  • The conversational bot interface for natural language customer interactions

Example usage of the Store Assistant Bot:

// Via curl
curl -X POST https://your-region-your-project.cloudfunctions.net/bot/query \
  -H "Content-Type: application/json" \
  -d '{"query":"Do you have the wireless headphones in stock?"}'

// Example response
{
  "success": true,
  "response": "Yes, we currently have Premium Wireless Headphones in stock! We have 50 units available at $349.99 each. Would you like me to tell you more about their features?"
}
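
From a web client (or Node 18+, where fetch is built in), the same call might look like this; the URL is the same placeholder used above:

async function askStoreBot(question) {
  const res = await fetch('https://your-region-your-project.cloudfunctions.net/bot/query', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: question })
  });
  const data = await res.json();
  return data.response;
}

askStoreBot('Do you have the wireless headphones in stock?')
  .then(answer => console.log(answer));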

Advanced Features to Consider

Now that you have the basic store assistant bot working, here are some enhancements to consider:

  • Real-time updates: Set up a Firestore trigger to automatically update BigQuery when inventory changes (a sketch follows this list)
  • Customer preferences: Track and remember customer preferences to personalize recommendations
  • Enhanced bot capabilities: Implement follow-up questions and conversation memory
  • Sales predictions: Use historical data to predict popular products and adjust inventory
  • Voice interface: Connect to voice assistants for hands-free product queries in the store
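
For the real-time updates idea, a minimal sketch of a Firestore trigger that streams new product_history documents into BigQuery could look like the following. It assumes the document fields already match the BigQuery schema (otherwise reuse the field mapping from Step 2) and would live alongside your other functions:

const { onDocumentCreated } = require('firebase-functions/v2/firestore');
const { BigQuery } = require('@google-cloud/bigquery');

const bigquery = new BigQuery();

// Stream each new product_history document straight into BigQuery
exports.syncProductHistory = onDocumentCreated('product_history/{docId}', async (event) => {
  const data = event.data.data();

  await bigquery.dataset('store_assistant').table('product_history').insert([{
    transaction_id: event.params.docId,
    ...data
  }], {
    ignoreUnknownValues: true,
    skipInvalidRows: true
  });
});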

Conclusion

We've built a powerful store assistant bot that combines the operational simplicity of Firebase with the analytical capabilities of BigQuery and the conversational intelligence of Vertex AI. This system allows store employees and customers to get instant answers about products using natural language questions instead of having to search through catalogs or ask multiple staff members.

The combination of these technologies provides several benefits:

  • Cost-effective: BigQuery's pay-per-query model keeps costs low
  • Scalable: Handles thousands of products and customer queries without performance issues
  • User-friendly: Natural language interface makes product information accessible to everyone
  • Customer satisfaction: Provides immediate answers to product questions, improving the shopping experience

I hope this tutorial helps you build your own intelligent store assistant bot! Feel free to reach out with questions in the comments.