
Configure Periodic Import

Periodically import data from configured data sources

After you configure the ingestion spec, such as a Confluence CQL query, and click Ingest, the periodic import configuration window appears. You can set the frequency, start time, and time zone, for example 12:00 AM weekly on Sunday. ChatBees then automatically schedules the ingestion task and keeps the Collection up to date with the data source.
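
If it helps to reason about when the next import will run, the sketch below is a minimal plain-Python illustration (not a ChatBees API call; the function name and parameters are hypothetical) of how a frequency, start time, and time zone such as 12:00 AM weekly on Sunday resolve to a concrete next run time.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

def next_weekly_run(weekday, hour, minute, tz_name):
    """Next run for a weekly schedule; weekday uses Monday=0 ... Sunday=6."""
    tz = ZoneInfo(tz_name)
    now = datetime.now(tz)
    # Candidate run at the requested wall-clock time today, in the target zone.
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    # Roll forward to the requested weekday; skip today if that time has passed.
    days_ahead = (weekday - candidate.weekday()) % 7
    if days_ahead == 0 and candidate <= now:
        days_ahead = 7
    return candidate + timedelta(days=days_ahead)

# "12:00 AM weekly on Sunday", US Pacific time (illustrative values):
print(next_weekly_run(weekday=6, hour=0, minute=0, tz_name="America/Los_Angeles"))
```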

Currently, only one periodic import is supported per data source. For example, if you crawl two websites into a collection, you can configure periodic import for only one of them; the latest configuration overwrites the previous one.