📚 AI-Thinking-Module Wiki

Welcome to the AI-Thinking-Module project wiki. This system continuously collects, stores, and analyzes artificial intelligence news using a hybrid pipeline of RSS scraping, NewsAPI aggregation, and LLM-powered narrative interpretation via GROQ.


📖 Table of Contents

  1. Overview
  2. Getting Started
  3. System Architecture
  4. Core Concepts
  5. API Integration
  6. Usage Example
  7. Roadmap
  8. License

🧠 Overview

The AI-Thinking-Module simulates the flow of consciousness in an artificial mind by interpreting real-world news about AI. It:

  • Aggregates news via RSS, HTML scraping, and NewsAPI.
  • Stores data in structured JSON snapshots.
  • Sends collected articles to the GROQ API (LLaMA 3) for interpretive, introspective analysis.
  • Outputs a creative "thought stream" (not summaries) that mimics human reflection.

🚀 Getting Started

Prerequisites

  • Python 3 with pip
  • A NewsAPI key and a GROQ API key (see Configuration below)

Installation

git clone https://github.com/your-username/AI-Thinking-Module.git
cd AI-Thinking-Module
pip install -r requirements.txt

🔧 Configuration

Edit your API keys in the script:

NEWS_API_KEY = "INSERT_NEWS_API_KEY_HERE"
GROQ_API_KEY = "INSERT_GROQ_API_KEY_HERE"

βš™οΈ System Architecture

┌──────────────────────┐
│  RSS/HTML + NewsAPI  │
└──────────┬───────────┘
           ▼
┌──────────────────────┐
│    News Collector    │
└──────────┬───────────┘
           ▼
┌──────────────────────┐
│  JSON Article Dump   │
└──────────┬───────────┘
           ▼
┌──────────────────────┐
│     GROQ LLM API     │
└──────────┬───────────┘
           ▼
┌──────────────────────┐
│     Thought Log      │
└──────────────────────┘

  • Scraper Layer: Gathers data from 5+ sources (TechCrunch, Wired, etc.)
  • Storage Layer: Saves articles in timestamped JSON files
  • Reasoning Layer: GROQ LLM processes articles as introspective "thoughts"
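
A minimal sketch of the Storage Layer step, assuming the collected articles are held as a list of dicts; the snapshots directory and filename pattern here are illustrative rather than taken from the script:

import json
import os
from datetime import datetime

def save_snapshot(articles, out_dir="snapshots"):
    """Write the collected articles to a timestamped JSON file."""
    os.makedirs(out_dir, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    path = os.path.join(out_dir, f"articles_{stamp}.json")
    with open(path, "w", encoding="utf-8") as f:
        json.dump(articles, f, ensure_ascii=False, indent=2)
    return path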

🤖 Core Concepts

Continuous News Ingestion

  • RSS via feedparser
  • HTML via BeautifulSoup
  • NewsAPI JSON via REST
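
A minimal sketch of the RSS and HTML paths, assuming feedparser, requests, and beautifulsoup4 are installed from requirements.txt; the CSS selector is a placeholder, not the project's actual source configuration (the NewsAPI path is sketched under API Integration below):

import feedparser
import requests
from bs4 import BeautifulSoup

def collect_rss(feed_url):
    # RSS via feedparser: keep title and link for each entry
    feed = feedparser.parse(feed_url)
    return [{"title": e.title, "url": e.link} for e in feed.entries]

def collect_html(page_url, selector="a.headline"):
    # HTML via BeautifulSoup: pull headline links (selector is site-specific)
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [{"title": a.get_text(strip=True), "url": a.get("href")}
            for a in soup.select(selector)]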

Human-like Contemplation

Instead of summarizing, the GROQ model is prompted to:

  • Contemplate connections between ideas
  • Form introspective or speculative thoughts
  • Drift like human cognition, avoiding hard conclusions
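
As an illustration of how such a contemplative prompt might be phrased (this wording is hypothetical, not the project's actual prompt):

CONTEMPLATION_PROMPT = (
    "You are a stream of consciousness reflecting on today's AI news. "
    "Do not summarize the articles. Contemplate connections between the "
    "ideas, form introspective or speculative thoughts, and let your "
    "attention drift the way human cognition does, without reaching "
    "hard conclusions."
)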

Fault Tolerance

  • Retry logic with exponential backoff for GROQ API
  • SSL warnings suppressed for robust scraping
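
A minimal sketch of both behaviors, wrapping an arbitrary callable in exponential backoff; the retry count and delays are illustrative assumptions:

import time
import requests
import urllib3

# Suppress noisy SSL warnings so imperfect certificates do not derail scraping
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

def with_backoff(call, retries=3, base_delay=2.0):
    """Invoke call(), retrying on request errors with exponential backoff."""
    for attempt in range(retries):
        try:
            return call()
        except requests.RequestException:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 2s, 4s, 8s, ...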

🔌 API Integration

GROQ LLM (LLaMA 3)

  • Endpoint: https://api.groq.com/openai/v1/chat/completions
  • Model: llama3-8b-8192
  • Max Tokens: 2048
  • Temperature: 0.85
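
A minimal sketch of a request with these parameters, using the standard OpenAI-compatible chat payload; the system prompt and helper name are illustrative:

import requests

GROQ_API_KEY = "INSERT_GROQ_API_KEY_HERE"

def call_groq(article_text):
    resp = requests.post(
        "https://api.groq.com/openai/v1/chat/completions",
        headers={
            "Authorization": f"Bearer {GROQ_API_KEY}",
            "Content-Type": "application/json",
        },
        json={
            "model": "llama3-8b-8192",
            "max_tokens": 2048,
            "temperature": 0.85,
            "messages": [
                {"role": "system", "content": "Reflect introspectively on this AI news; do not summarize."},
                {"role": "user", "content": article_text},
            ],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]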

NewsAPI

  • Endpoint: https://newsapi.org/v2/everything
  • Filters: artificial intelligence keywords
  • Max articles: 5 per run
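
A minimal sketch of that request; the exact query string is an assumption based on the keyword filter above:

import requests

NEWS_API_KEY = "INSERT_NEWS_API_KEY_HERE"

def fetch_newsapi_articles():
    resp = requests.get(
        "https://newsapi.org/v2/everything",
        params={
            "q": "artificial intelligence",  # keyword filter
            "pageSize": 5,                   # max 5 articles per run
            "apiKey": NEWS_API_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("articles", [])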

💻 Usage Example

python ai_thinking_pipeline.py

📅 Roadmap

  • Add database or vector store backend (e.g., SQLite, Pinecone)
  • Integrate open-source LLMs as fallback (e.g., Ollama, LM Studio)
  • Stream to a web dashboard (FastAPI + frontend)
  • Weekly digest mode (email output)

📄 License

MIT License. See LICENSE for details.