
Commit 630e06f

Refactor Directories for Readability and Maintainability and Documentation Updated for Responsive Design Update (#59)
* Documentation updated to match newer patterns
* Update CONTRIBUTING to match new patterns better
* Move directories into new organized structure
1 parent c5f2f71 commit 630e06f

121 files changed (+3562, −3612 lines)


CONTRIBUTING.md

Lines changed: 20 additions & 27 deletions
````diff
@@ -40,10 +40,9 @@ This project and everyone participating in it is governed by the [WebUI Code of
 ### Pull Requests
 
 1. Fork the repository
-2. Create your feature branch from the `development` branch, (`git checkout -b feature/amazing-feature`)
-3. Commit your changes following conventional commit messages (`git commit -m 'feat: add some amazing feature'`)
-4. Push to the branch (`git push origin feature/amazing-feature`)
-5. Open a Pull Request to the `development` branch.
+2. Create your feature branch from the `development` branch, (`git checkout -b feature-00-amazing-feature`)
+3. Push to the branch (`git push origin feature-00-amazing-feature`)
+4. Open a Pull Request to the `development` branch.
 
 ## Development Process
 
@@ -66,9 +65,9 @@ WebUI maintains three primary branches:
 
 Version bumps are triggered automatically via commit messages. Use the following prefixes:
 
-- `feat!:` - Major version increment for breaking changes (e.g., `1.0.0` → `2.0.0`).
-- `feat:` - Minor version increment for new features (e.g., `1.0.0` → `1.1.0`).
-- `fix:` or `fix!:` - Patch version increment for bug fixes (e.g., `1.0.0` → `1.0.1`).
+- `feat!:` or `Release` - Major version increment for breaking changes (e.g., `1.0.0` → `2.0.0`).
+- `feat:` or `Feature` - Minor version increment for new features (e.g., `1.0.0` → `1.1.0`).
+- `fix:`, `fix!:` or `Fixes` - Patch version increment for bug fixes (e.g., `1.0.0` → `1.0.1`).
 
 ### Release Process
 
````
````diff
@@ -79,23 +78,11 @@ The release workflow is fully automated:
 3. Release notes are automatically generated from the commit messages.
 4. A new tag is created and the release is published on GitHub.
 
-### Quick Fixes (Hotfixes)
-
-For urgent fixes that need to be pushed to `main` right away:
-
-1. Create a PR targeting the `main` branch.
-2. Include `fix!:` in the PR title and merge message.
-3. Once approved and merged, an action will automatically create PRs for `next` and `development` branches.
-4. This ensures all branches remain in sync when quick changes are required in the main branch.
-
-> [!WARNING]
-> Ensure the auto-generated PRs are approved and merged promptly to maintain branch synchronization.
-
 ### Testing
 
 Automated tests run on every pull request to `main` and `next` branches:
 
-1. Tests are executed in a macOS environment with Swift 6.1.
+1. Tests are executed in a macOS environment.
 2. The workflow includes caching of Swift Package Manager dependencies for faster builds.
 3. All tests must pass before a PR can be merged.
 
@@ -116,6 +103,9 @@ WebUI uses Swift DocC for documentation:
 swift package --disable-sandbox preview-documentation
 ```
 
+> [!NOTE]
+> You can also run `Build Documentation` inside of Xcode to view the documentation in Xcode's documentation viewer.
+
 ### Adding New Elements
 
 WebUI follows a compositional pattern for creating HTML elements. When adding a new element, adhere to these guidelines:
````
````diff
@@ -139,12 +129,12 @@ WebUI follows a compositional pattern for creating HTML elements. When adding a
 /// Defines the types available for the element.
 ///
 /// Detailed documentation about the enum and its purpose.
-public enum ElementType: String {
+public enum ElementCustom: String {
     /// Documentation for this case.
-    case primary
+    case one
 
     /// Documentation for this case.
-    case secondary
+    case two
 }
 ```
 
@@ -155,12 +145,12 @@ WebUI follows a compositional pattern for creating HTML elements. When adding a
 /// Detailed documentation about what this element represents and its use cases.
 public final class ElementName: Element {
     // Properties specific to this element
-    let type: ElementType?
+    let customType: ElementCustom?
 
     /// Creates a new HTML element_name.
     ///
     /// - Parameters:
-    ///   - type: Type of the element, optional.
+    ///   - custom: An example custom attribute, optional.
     ///   - id: Unique identifier, optional.
     ///   - classes: CSS class names, optional.
     ///   - role: ARIA role for accessibility, optional.
@@ -187,7 +177,7 @@ WebUI follows a compositional pattern for creating HTML elements. When adding a
 
     // Build custom attributes using Attr namespace
     let customAttributes = [
-        Attribute.typed("type", type)
+        Attribute.typed("custom", custom)  // will generate as `custom="\(custom)"`
     ].compactMap { $0 }
 
     // Initialize the parent Element class
@@ -205,14 +195,17 @@ WebUI follows a compositional pattern for creating HTML elements. When adding a
 }
 ```
 
-4. **Testing**: Add unit tests for the new element in the `Tests` directory.
+4. **Testing**: Add unit tests for the new element in the `Tests/Styles` directory.
 
 5. **Documentation**: Include comprehensive DocC documentation with:
    - Class-level documentation explaining the element's purpose
    - Parameter documentation for all initializer parameters
    - Usage examples showing common implementations
    - Mention any accessibility considerations
 
+> [!IMPORTANT]
+> Pull requests with new elements, modifiers and utilities will be rejected or put on hold until adequate documentation is provided. This is extremely important both so that end users of the library understand what each element does and means semantically, and to ensure maintainability for the maintainers of the project.
+
 ## Adding New Style Modifiers
 
 Style modifiers in WebUI follow the unified style system pattern. Here's how to add a new style modifier:
````
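Assembled from the element fragments in the diff above, the post-refactor skeleton of the guide's example looks roughly like the sketch below. `ElementCustom` and `ElementName` are the guide's own placeholder names; WebUI's real `Element` base class and the exact `Attribute.typed` signature are only partially visible in this diff, so a stand-in base class is defined here just so the sketch compiles on its own:

```swift
// Stand-in base class so this sketch is self-contained; WebUI's real
// `Element` class is more capable and is not shown in this diff.
open class Element {
    public init() {}
}

/// Defines the types available for the element (placeholder names from the guide).
public enum ElementCustom: String {
    /// Documentation for this case.
    case one
    /// Documentation for this case.
    case two
}

/// Placeholder element from the contributing guide, assembled from the hunks above.
public final class ElementName: Element {
    // Properties specific to this element
    let customType: ElementCustom?

    public init(custom: ElementCustom? = nil) {
        self.customType = custom
        // In WebUI the initializer also builds custom attributes, e.g.
        //   Attribute.typed("custom", custom)   // renders as custom="..."
        // and passes them up to the parent Element initializer.
        super.init()
    }
}
```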

Sources/WebUI/Core/Metadata/Metadata.swift renamed to Sources/WebUI/Core/Infrastructure/Metadata/Metadata.swift

Lines changed: 2 additions & 0 deletions
````diff
@@ -142,6 +142,8 @@ public struct Metadata {
 /// - locale: Override for the language locale.
 /// - type: Override for the content type.
 /// - themeColor: Override for the theme color.
+/// - favicons: Override for the favicons.
+/// - structuredData: Override for the structured data.
 ///
 /// - Example:
 /// ```swift
````
Lines changed: 80 additions & 0 deletions
@@ -0,0 +1,80 @@

```swift
import Foundation

/// Provides functionality for generating robots.txt files.
///
/// The `Robots` struct offers methods for creating standards-compliant robots.txt files
/// that provide instructions to web crawlers about which parts of a website they can access.
public struct Robots {

    /// Generates a robots.txt file content.
    ///
    /// This method creates a standard robots.txt file that includes instructions for web crawlers,
    /// including a reference to the sitemap if one exists.
    ///
    /// - Parameters:
    ///   - baseURL: The optional base URL of the website (e.g., "https://example.com").
    ///   - generateSitemap: Whether a sitemap is being generated for this website.
    ///   - robotsRules: Custom rules to include in the robots.txt file.
    /// - Returns: A string containing the content of the robots.txt file.
    ///
    /// - Example:
    ///   ```swift
    ///   let content = Robots.generateTxt(
    ///       baseURL: "https://example.com",
    ///       generateSitemap: true,
    ///       robotsRules: [.allowAll()]
    ///   )
    ///   ```
    ///
    /// - Note: If custom rules are provided, they will be included in the file.
    ///   Otherwise, a default permissive robots.txt will be generated.
    public static func generateTxt(
        baseURL: String? = nil,
        generateSitemap: Bool = false,
        robotsRules: [RobotsRule]? = nil
    ) -> String {
        var contentComponents = ["# robots.txt generated by WebUI\n"]

        if let rules = robotsRules, !rules.isEmpty {
            // Add each custom rule
            for rule in rules {
                contentComponents.append(formatRule(rule))
            }
        } else {
            // Default permissive robots.txt
            contentComponents.append("User-agent: *\nAllow: /\n")
        }

        // Add sitemap reference if applicable
        if generateSitemap, let baseURL = baseURL {
            contentComponents.append("Sitemap: \(baseURL)/sitemap.xml")
        }

        return contentComponents.joined(separator: "\n")
    }

    /// Formats a single robots rule as a string.
    ///
    /// - Parameter rule: The robots rule to format.
    /// - Returns: A string representation of the rule.
    private static func formatRule(_ rule: RobotsRule) -> String {
        var ruleComponents = ["User-agent: \(rule.userAgent)"]

        // Add disallow paths
        if let disallow = rule.disallow, !disallow.isEmpty {
            ruleComponents.append(contentsOf: disallow.map { "Disallow: \($0)" })
        }

        // Add allow paths
        if let allow = rule.allow, !allow.isEmpty {
            ruleComponents.append(contentsOf: allow.map { "Allow: \($0)" })
        }

        // Add crawl delay if provided
        if let crawlDelay = rule.crawlDelay {
            ruleComponents.append("Crawl-delay: \(crawlDelay)")
        }

        return ruleComponents.joined(separator: "\n")
    }
}
```
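As a quick sanity check of the format, tracing `generateTxt` through the doc comment's own example gives the output below. The exact string is derived by hand from the code above, not taken from the project's tests, so treat it as an inference:

```swift
let content = Robots.generateTxt(
    baseURL: "https://example.com",
    generateSitemap: true,
    robotsRules: [.allowAll()]
)
// The header component ends in "\n", so the "\n" join leaves a blank
// line after it:
//
//   # robots.txt generated by WebUI
//
//   User-agent: *
//   Allow: /
//   Sitemap: https://example.com/sitemap.xml
print(content)
```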
Lines changed: 104 additions & 0 deletions
@@ -0,0 +1,104 @@

```swift
import Foundation

/// Represents a rule in a robots.txt file.
///
/// Used to define instructions for web crawlers about which parts of the website should be crawled.
/// Each rule specifies which user agents (crawlers) it applies to and what paths they can access.
/// For more information about the robots.txt standard, see: https://developers.google.com/search/docs/crawling-indexing/robots/intro
public struct RobotsRule: Equatable, Hashable {
    /// The user agent the rule applies to (e.g., "Googlebot" or "*" for all crawlers).
    public let userAgent: String

    /// Paths that should not be crawled.
    public let disallow: [String]?

    /// Paths that are allowed to be crawled (overrides disallow rules).
    public let allow: [String]?

    /// The delay between successive crawls in seconds.
    public let crawlDelay: Int?

    /// Creates a new robots.txt rule.
    ///
    /// - Parameters:
    ///   - userAgent: The user agent the rule applies to (e.g., "Googlebot" or "*" for all crawlers).
    ///   - disallow: Paths that should not be crawled.
    ///   - allow: Paths that are allowed to be crawled (overrides disallow rules).
    ///   - crawlDelay: The delay between successive crawls in seconds.
    ///
    /// - Example:
    ///   ```swift
    ///   let rule = RobotsRule(
    ///       userAgent: "*",
    ///       disallow: ["/admin/", "/private/"],
    ///       allow: ["/public/"],
    ///       crawlDelay: 10
    ///   )
    ///   ```
    public init(
        userAgent: String,
        disallow: [String]? = nil,
        allow: [String]? = nil,
        crawlDelay: Int? = nil
    ) {
        self.userAgent = userAgent
        self.disallow = disallow
        self.allow = allow
        self.crawlDelay = crawlDelay
    }

    /// Creates a rule that allows all crawlers to access the entire site.
    ///
    /// - Returns: A rule that allows all paths for all user agents.
    ///
    /// - Example:
    ///   ```swift
    ///   let allowAllRule = RobotsRule.allowAll()
    ///   ```
    public static func allowAll() -> RobotsRule {
        RobotsRule(userAgent: "*", allow: ["/"])
    }

    /// Creates a rule that disallows all crawlers from accessing the entire site.
    ///
    /// - Returns: A rule that disallows all paths for all user agents.
    ///
    /// - Example:
    ///   ```swift
    ///   let disallowAllRule = RobotsRule.disallowAll()
    ///   ```
    public static func disallowAll() -> RobotsRule {
        RobotsRule(userAgent: "*", disallow: ["/"])
    }

    /// Creates a rule for a specific crawler with custom access permissions.
    ///
    /// - Parameters:
    ///   - agent: The specific crawler user agent (e.g., "Googlebot").
    ///   - disallow: Paths that should not be crawled.
    ///   - allow: Paths that are allowed to be crawled.
    ///   - crawlDelay: The delay between successive crawls in seconds.
    /// - Returns: A rule configured for the specified crawler.
    ///
    /// - Example:
    ///   ```swift
    ///   let googleRule = RobotsRule.forAgent(
    ///       "Googlebot",
    ///       disallow: ["/private/"],
    ///       allow: ["/public/"]
    ///   )
    ///   ```
    public static func forAgent(
        _ agent: String,
        disallow: [String]? = nil,
        allow: [String]? = nil,
        crawlDelay: Int? = nil
    ) -> RobotsRule {
        RobotsRule(
            userAgent: agent,
            disallow: disallow,
            allow: allow,
            crawlDelay: crawlDelay
        )
    }
}
```
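A short usage sketch combining the convenience constructors with `Robots.generateTxt` from the previous file. The output is again inferred by tracing the code; note that `generateTxt` joins consecutive rules with a single newline, so no blank line separates them:

```swift
let content = Robots.generateTxt(
    baseURL: "https://example.com",
    robotsRules: [
        .forAgent("Googlebot", disallow: ["/private/"], crawlDelay: 5),
        .disallowAll(),
    ]
)
// Inferred output (no Sitemap line, since generateSitemap defaults to false):
//
//   # robots.txt generated by WebUI
//
//   User-agent: Googlebot
//   Disallow: /private/
//   Crawl-delay: 5
//   User-agent: *
//   Disallow: /
print(content)
```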
