A high-performance changelog generator for Git repositories that automatically creates changelogs from your Git history and merged pull requests.

- **Unreleased changes**: Tracks all commits since the last release
- **Concurrent processing**: Parallel GitHub API calls for improved performance
- **Flexible output**: Generate complete changelogs or target specific versions
- **GraphQL optimization**: Ultra-fast PR fetching using the GitHub GraphQL API (~5-10 calls vs. thousands)
- **Intelligent sync**: Automatically syncs new PRs every 24 hours or when missing PRs are detected
- **AI-powered summaries**: Optional Fabric integration for enhanced changelog summaries
- **Advanced caching**: Content-based change detection for AI summaries with hash comparison
- **Author type detection**: Distinguishes between users, bots, and organizations
- **Lightning-fast incremental updates**: SHA→PR mapping for instant git operations

## Installation

```bash
go install github.com/danielmiessler/fabric/cmd/generate_changelog@latest
```

## Usage

### Basic usage (generate complete changelog)

```bash
generate_changelog
```

### Save to file

```bash
generate_changelog -o CHANGELOG.md
```

### Generate for a specific version

```bash
generate_changelog -v v1.4.244
```

### Limit to recent versions

```bash
generate_changelog -l 10
```

### Using a GitHub token for private repos or higher rate limits

```bash
export GITHUB_TOKEN=your_token_here
generate_changelog

# Or pass the token directly
generate_changelog --token your_token_here
```

### AI-enhanced summaries

```bash
# Enable AI summaries using Fabric
generate_changelog --ai-summarize

# Use a custom model for AI summaries
FABRIC_CHANGELOG_SUMMARIZE_MODEL=claude-opus-4 generate_changelog --ai-summarize
```

### Cache management

```bash
# Rebuild cache from scratch
generate_changelog --rebuild-cache

# Use a custom cache location
generate_changelog --cache /path/to/cache.db
```

## Options

| Flag | Short | Description | Default |
| ---- | ----- | ----------- | ------- |
| `--rebuild-cache` | | Rebuild cache from scratch | false |
| `--force-pr-sync` | | Force a full PR sync from GitHub | false |
| `--token` | | GitHub API token | `$GITHUB_TOKEN` |
| `--ai-summarize` | | Generate AI-enhanced summaries using Fabric | false |

## Output Format

The generated changelog follows this structure:

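An abbreviated sketch of the shape (version numbers, dates, and PR titles here are illustrative, not real entries):

```markdown
# Changelog

## Unreleased

- PR #1234: Example feature title

## v1.4.244 (2025-01-15)

- PR #1230: Example fix title
```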

## Performance

- **Concurrent API calls**: Processes up to 10 GitHub API requests in parallel
- **Smart caching**: SQLite cache eliminates redundant API calls
- **Incremental updates**: Only processes new commits on subsequent runs
- **GraphQL optimization**: Uses the GitHub GraphQL API to fetch all PR data in ~5-10 calls
- **AI-powered summaries**: Optional Fabric integration with intelligent caching
- **Content-based change detection**: AI summaries are only regenerated when content changes
- **Lightning-fast git operations**: SHA→PR mapping stored in the database for instant lookups
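
For a sense of what the SHA→PR mapping enables, commit-to-PR association becomes a single indexed lookup instead of a GitHub API round-trip. A hypothetical sketch (table and column names are illustrative, not the tool's actual schema):

```bash
# Hypothetical: resolve a commit SHA to its merged PR straight from the cache
sqlite3 /path/to/cache.db \
  "SELECT pr_number FROM commit_pr_map WHERE sha = 'abc1234';"
```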

### Major Optimization: GraphQL + Advanced Caching

The tool has been optimized to drastically reduce GitHub API calls and improve performance:

**Previous approach**: Individual API calls for each PR (two API calls per PR: one for details, one for commits)

- For a repo with 500 PRs: 1,000 API calls

**Current approach**: GraphQL batch fetching with intelligent caching

- For a repo with 500 PRs: ~5-10 GraphQL calls on the initial fetch (GraphQL returns up to 100 PRs per page), then 0 calls on subsequent runs served from the cache
- **99%+ reduction in API calls after the initial run!**

The optimization includes:

1. **GraphQL Batch Fetch**: Uses GitHub's GraphQL API to fetch all merged PRs with their commits in minimal calls (see the sketch after this list)
2. **Smart Caching**: Stores complete PR data, commits, and SHA mappings in SQLite
3. **Incremental Sync**: Only fetches PRs merged after the last sync timestamp
4. **Automatic Refresh**: PRs are synced every 24 hours or when missing PRs are detected
5. **AI Summary Caching**: Content-based change detection prevents unnecessary AI regeneration
6. **Fallback Support**: If GraphQL fails, falls back to REST API batch fetching
7. **Lightning Git Operations**: Pre-computed SHA→PR mappings for instant commit association
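
To make the batch fetch concrete, here is roughly the kind of query involved, expressed with the `gh` CLI (illustrative only; the tool issues its own GraphQL queries and the exact fields differ):

```bash
# One call returns up to 100 merged PRs; pageInfo.endCursor drives pagination,
# so ~500 PRs take ~5 calls instead of ~1,000 REST requests.
gh api graphql -f query='
{
  repository(owner: "danielmiessler", name: "fabric") {
    pullRequests(states: MERGED, first: 100, orderBy: {field: CREATED_AT, direction: DESC}) {
      nodes { number title mergedAt author { login } }
      pageInfo { hasNextPage endCursor }
    }
  }
}'
```
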
## Requirements

- Go 1.24+ (for installation from source)
- Git repository
- GitHub token (optional, for private repos or higher rate limits)
- Fabric CLI (optional, for AI-enhanced summaries)

## Authentication

The tool supports GitHub authentication via:

1. Environment variable: `export GITHUB_TOKEN=your_token`
2. Command-line flag: `--token your_token`
3. A `.env` file in the same directory as the binary

### Environment File Support

Create a `.env` file next to the `generate_changelog` binary:

```bash
GITHUB_TOKEN=your_github_token_here
FABRIC_CHANGELOG_SUMMARIZE_MODEL=claude-sonnet-4-20250514
```

The tool automatically loads `.env` files for convenient configuration management.

Without authentication, the tool is limited to 60 GitHub API requests per hour.
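
You can check your current quota directly against GitHub's rate-limit endpoint:

```bash
# Compare unauthenticated vs. authenticated limits
curl -s https://api.github.com/rate_limit
curl -s -H "Authorization: Bearer $GITHUB_TOKEN" https://api.github.com/rate_limit
```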

## Caching

The SQLite cache stores:

- Version information and commit associations
- Pull request details (title, body, commits, authors)
- Last processed commit SHA for incremental updates
- Last PR sync timestamp for intelligent refresh
- AI summaries with content-based change detection
- SHA→PR mappings for lightning-fast git operations
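
Because the cache is an ordinary SQLite file, you can inspect it with the `sqlite3` CLI (use the path passed via `--cache`; table names vary by version):

```bash
sqlite3 /path/to/cache.db '.tables'   # list cached tables
sqlite3 /path/to/cache.db '.schema'   # show their structure
```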

Cache benefits:

- Instant changelog regeneration
- Drastically reduced GitHub API usage (99%+ reduction after the initial run)
- Offline changelog generation (after the initial cache build)
- Automatic PR data refresh every 24 hours
- Batch database transactions for better performance
- Content-aware AI summary regeneration

## AI-Enhanced Summaries

The tool can generate AI-powered summaries using Fabric for more polished, professional changelogs:

```bash
# Enable AI summarization
generate_changelog --ai-summarize

# Custom model (default: claude-sonnet-4-20250514)
FABRIC_CHANGELOG_SUMMARIZE_MODEL=claude-opus-4 generate_changelog --ai-summarize
```

### AI Summary Features

- **Content-based change detection**: AI summaries are only regenerated when version content changes
- **Intelligent caching**: Preserves existing summaries and only processes changed versions
- **Content hash comparison**: Uses SHA-256 hashing to detect when "Unreleased" content changes (see the sketch after this list)
- **Automatic fallback**: Falls back to raw content if AI processing fails
- **Error detection**: Identifies and handles AI processing errors gracefully
- **Minimum content filtering**: Skips AI processing for very brief content (under 256 characters)
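
A minimal sketch of that detection logic (the tool implements this internally in Go; `$stored_hash` is a hypothetical placeholder for the cached value):

```bash
# Recompute the content hash and regenerate only on change
new_hash=$(printf '%s' "$version_content" | sha256sum | awk '{print $1}')
if [ "$new_hash" != "$stored_hash" ]; then
  generate_changelog --ai-summarize   # content changed; refresh the summary
fi
```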

### AI Model Configuration

Set the model via an environment variable:

```bash
export FABRIC_CHANGELOG_SUMMARIZE_MODEL=claude-opus-4
# or
export FABRIC_CHANGELOG_SUMMARIZE_MODEL=gpt-4
```

251
+ AI summaries are cached and only regenerated when:
252
+
253
+ - Version content changes (detected via hash comparison)
254
+ - No existing AI summary exists for the version
255
+ - Force rebuild is requested

## Contributing

This tool is part of the Fabric project. Contributions are welcome!

## License

MIT License, same as the Fabric project.