
Refactor Directories for Readability and Maintainability and Documentation Updated for Responsive Design Update #59

Merged · 6 commits · May 23, 2025
47 changes: 20 additions & 27 deletions CONTRIBUTING.md
@@ -40,10 +40,9 @@ This project and everyone participating in it is governed by the [WebUI Code of
### Pull Requests

1. Fork the repository
2. Create your feature branch from the `development` branch, (`git checkout -b feature/amazing-feature`)
3. Commit your changes following conventional commit messages (`git commit -m 'feat: add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request to the `development` branch.
2. Create your feature branch from the `development` branch (`git checkout -b feature-00-amazing-feature`)
3. Push to the branch (`git push origin feature-00-amazing-feature`)
4. Open a Pull Request to the `development` branch.

## Development Process

@@ -66,9 +65,9 @@ WebUI maintains three primary branches:

Version bumps are triggered automatically via commit messages. Use the following prefixes:

- `feat!:` - Major version increment for breaking changes (e.g., `1.0.0` → `2.0.0`).
- `feat:` - Minor version increment for new features (e.g., `1.0.0` → `1.1.0`).
- `fix:` or `fix!:` - Patch version increment for bug fixes (e.g., `1.0.0` → `1.0.1`).
- `feat!:` or `Release` - Major version increment for breaking changes (e.g., `1.0.0` → `2.0.0`).
- `feat:` or `Feature` - Minor version increment for new features (e.g., `1.0.0` → `1.1.0`).
- `fix:`, `fix!:` or `Fixes` - Patch version increment for bug fixes (e.g., `1.0.0` → `1.0.1`).

### Release Process

@@ -79,23 +78,11 @@ The release workflow is fully automated:
3. Release notes are automatically generated from the commit messages.
4. A new tag is created and the release is published on GitHub.

### Quick Fixes (Hotfixes)

For urgent fixes that need to be pushed to `main` right away:

1. Create a PR targeting the `main` branch.
2. Include `fix!:` in the PR title and merge message.
3. Once approved and merged, an action will automatically create PRs for `next` and `development` branches.
4. This ensures all branches remain in sync when quick changes are required in the main branch.

> [!WARNING]
> Ensure the auto-generated PRs are approved and merged promptly to maintain branch synchronization.

### Testing

Automated tests run on every pull request to `main` and `next` branches:

1. Tests are executed in a macOS environment with Swift 6.1.
1. Tests are executed in a macOS environment.
2. The workflow includes caching of Swift Package Manager dependencies for faster builds.
3. All tests must pass before a PR can be merged.

@@ -116,6 +103,9 @@ WebUI uses Swift DocC for documentation:
swift package --disable-sandbox preview-documentation
```

> [!NOTE]
> You can also run `Build Documentation` inside Xcode to view the documentation in Xcode's built-in documentation viewer.

### Adding New Elements

WebUI follows a compositional pattern for creating HTML elements. When adding a new element, adhere to these guidelines:
@@ -139,12 +129,12 @@ WebUI follows a compositional pattern for creating HTML elements. When adding a
/// Defines the types available for the element.
///
/// Detailed documentation about the enum and its purpose.
public enum ElementType: String {
public enum ElementCustom: String {
/// Documentation for this case.
case primary
case one

/// Documentation for this case.
case secondary
case two
}
```

@@ -155,12 +145,12 @@ WebUI follows a compositional pattern for creating HTML elements. When adding a
/// Detailed documentation about what this element represents and its use cases.
public final class ElementName: Element {
// Properties specific to this element
let type: ElementType?
let customType: ElementCustom?

/// Creates a new HTML element_name.
///
/// - Parameters:
/// - type: Type of the element, optional.
/// - custom: An example custom attribute, optional.
/// - id: Unique identifier, optional.
/// - classes: CSS class names, optional.
/// - role: ARIA role for accessibility, optional.
@@ -187,7 +177,7 @@ WebUI follows a compositional pattern for creating HTML elements. When adding a

// Build custom attributes using Attr namespace
let customAttributes = [
Attribute.typed("type", type)
Attribute.typed("custom", custom) // will generate as `custom="\(custom)"`
].compactMap { $0 }

// Initialize the parent Element class
@@ -205,14 +195,17 @@ WebUI follows a compositional pattern for creating HTML elements. When adding a
}
```

4. **Testing**: Add unit tests for the new element in the `Tests` directory.
4. **Testing**: Add unit tests for the new element in the `Tests/Styles` directory (see the test sketch at the end of this section).

5. **Documentation**: Include comprehensive DocC documentation with:
- Class-level documentation explaining the element's purpose
- Parameter documentation for all initializer parameters
- Usage examples showing common implementations
- Mention any accessibility considerations

> [!IMPORTANT]
> Pull requests with new elements, modifiers, and utilities will be rejected or put on hold until adequate documentation is provided. This is extremely important both so that end users of the library understand what each element does and means semantically, and so that the project remains maintainable for its maintainers.
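
To make step 4 concrete, here is a minimal test sketch for the hypothetical `ElementName` shown in the guidelines above. It assumes the test target uses the Swift Testing framework and that elements expose a `render()` method returning their HTML output; both are assumptions for illustration, not part of this PR.

```swift
import Testing

@testable import WebUI

@Suite("ElementName")
struct ElementNameTests {
    @Test("renders the custom attribute")
    func rendersCustomAttribute() {
        // `ElementName`, `ElementCustom`, and `render()` are hypothetical,
        // mirroring the example element defined in the guidelines above.
        let element = ElementName(custom: .one, id: "example")
        let html = element.render()

        #expect(html.contains("custom=\"one\""))
        #expect(html.contains("id=\"example\""))
    }
}
```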

## Adding New Style Modifiers

Style modifiers in WebUI follow the unified style system pattern. Here's how to add a new style modifier:
@@ -142,6 +142,8 @@ public struct Metadata {
/// - locale: Override for the language locale.
/// - type: Override for the content type.
/// - themeColor: Override for the theme color.
/// - favicons: Override for the favicons.
/// - structuredData: Override for the structured data.
///
/// - Example:
/// ```swift
80 changes: 80 additions & 0 deletions Sources/WebUI/Core/Infrastructure/Robots/GenerateRobots.swift
@@ -0,0 +1,80 @@
import Foundation

/// Provides functionality for generating robots.txt files.
///
/// The `Robots` struct offers methods for creating standards-compliant robots.txt files
/// that provide instructions to web crawlers about which parts of a website they can access.
public struct Robots {

/// Generates a robots.txt file content.
///
/// This method creates a standard robots.txt file that includes instructions for web crawlers,
/// including a reference to the sitemap if one exists.
///
/// - Parameters:
/// - baseURL: The optional base URL of the website (e.g., "https://example.com").
/// - generateSitemap: Whether a sitemap is being generated for this website.
/// - robotsRules: Custom rules to include in the robots.txt file.
/// - Returns: A string containing the content of the robots.txt file.
///
/// - Example:
/// ```swift
/// let content = Robots.generateTxt(
/// baseURL: "https://example.com",
/// generateSitemap: true,
/// robotsRules: [.allowAll()]
/// )
/// ```
///
/// - Note: If custom rules are provided, they will be included in the file.
/// Otherwise, a default permissive robots.txt will be generated.
public static func generateTxt(
baseURL: String? = nil,
generateSitemap: Bool = false,
robotsRules: [RobotsRule]? = nil
) -> String {
var contentComponents = ["# robots.txt generated by WebUI\n"]

if let rules = robotsRules, !rules.isEmpty {
// Add each custom rule
for rule in rules {
contentComponents.append(formatRule(rule))
}
} else {
// Default permissive robots.txt
contentComponents.append("User-agent: *\nAllow: /\n")
}

// Add sitemap reference if applicable
if generateSitemap, let baseURL = baseURL {
contentComponents.append("Sitemap: \(baseURL)/sitemap.xml")
}

return contentComponents.joined(separator: "\n")
}

/// Formats a single robots rule as a string.
///
/// - Parameter rule: The robots rule to format.
/// - Returns: A string representation of the rule.
private static func formatRule(_ rule: RobotsRule) -> String {
var ruleComponents = ["User-agent: \(rule.userAgent)"]

// Add disallow paths
if let disallow = rule.disallow, !disallow.isEmpty {
ruleComponents.append(contentsOf: disallow.map { "Disallow: \($0)" })
}

// Add allow paths
if let allow = rule.allow, !allow.isEmpty {
ruleComponents.append(contentsOf: allow.map { "Allow: \($0)" })
}

// Add crawl delay if provided
if let crawlDelay = rule.crawlDelay {
ruleComponents.append("Crawl-delay: \(crawlDelay)")
}

return ruleComponents.joined(separator: "\n")
}
}
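
As a quick sanity check of the new API, here is a small usage sketch together with the output the implementation above produces when no custom rules are supplied; the base URL is a placeholder.

```swift
// Default, permissive robots.txt with a sitemap reference.
let txt = Robots.generateTxt(
    baseURL: "https://example.com",
    generateSitemap: true
)

// txt now contains:
// # robots.txt generated by WebUI
//
// User-agent: *
// Allow: /
//
// Sitemap: https://example.com/sitemap.xml
```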
104 changes: 104 additions & 0 deletions Sources/WebUI/Core/Infrastructure/Robots/RobotsRule.swift
@@ -0,0 +1,104 @@
import Foundation

/// Represents a rule in a robots.txt file.
///
/// Used to define instructions for web crawlers about which parts of the website should be crawled.
/// Each rule specifies which user agents (crawlers) it applies to and what paths they can access.
/// For more information about the robots.txt standard, see: https://developers.google.com/search/docs/crawling-indexing/robots/intro
public struct RobotsRule: Equatable, Hashable {
/// The user agent the rule applies to (e.g., "Googlebot" or "*" for all crawlers).
public let userAgent: String

/// Paths that should not be crawled.
public let disallow: [String]?

/// Paths that are allowed to be crawled (overrides disallow rules).
public let allow: [String]?

/// The delay between successive crawls in seconds.
public let crawlDelay: Int?

/// Creates a new robots.txt rule.
///
/// - Parameters:
/// - userAgent: The user agent the rule applies to (e.g., "Googlebot" or "*" for all crawlers).
/// - disallow: Paths that should not be crawled.
/// - allow: Paths that are allowed to be crawled (overrides disallow rules).
/// - crawlDelay: The delay between successive crawls in seconds.
///
/// - Example:
/// ```swift
/// let rule = RobotsRule(
/// userAgent: "*",
/// disallow: ["/admin/", "/private/"],
/// allow: ["/public/"],
/// crawlDelay: 10
/// )
/// ```
public init(
userAgent: String,
disallow: [String]? = nil,
allow: [String]? = nil,
crawlDelay: Int? = nil
) {
self.userAgent = userAgent
self.disallow = disallow
self.allow = allow
self.crawlDelay = crawlDelay
}

/// Creates a rule that allows all crawlers to access the entire site.
///
/// - Returns: A rule that allows all paths for all user agents.
///
/// - Example:
/// ```swift
/// let allowAllRule = RobotsRule.allowAll()
/// ```
public static func allowAll() -> RobotsRule {
RobotsRule(userAgent: "*", allow: ["/"])
}

/// Creates a rule that disallows all crawlers from accessing the entire site.
///
/// - Returns: A rule that disallows all paths for all user agents.
///
/// - Example:
/// ```swift
/// let disallowAllRule = RobotsRule.disallowAll()
/// ```
public static func disallowAll() -> RobotsRule {
RobotsRule(userAgent: "*", disallow: ["/"])
}

/// Creates a rule for a specific crawler with custom access permissions.
///
/// - Parameters:
/// - agent: The specific crawler user agent (e.g., "Googlebot").
/// - disallow: Paths that should not be crawled.
/// - allow: Paths that are allowed to be crawled.
/// - crawlDelay: The delay between successive crawls in seconds.
/// - Returns: A rule configured for the specified crawler.
///
/// - Example:
/// ```swift
/// let googleRule = RobotsRule.forAgent(
/// "Googlebot",
/// disallow: ["/private/"],
/// allow: ["/public/"]
/// )
/// ```
public static func forAgent(
_ agent: String,
disallow: [String]? = nil,
allow: [String]? = nil,
crawlDelay: Int? = nil
) -> RobotsRule {
RobotsRule(
userAgent: agent,
disallow: disallow,
allow: allow,
crawlDelay: crawlDelay
)
}
}
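
A companion sketch showing how the factory helpers combine with `Robots.generateTxt`; the paths and crawl delay are illustrative values only.

```swift
// Block everything by default, then grant Googlebot limited access.
let rules: [RobotsRule] = [
    .disallowAll(),
    .forAgent("Googlebot", disallow: ["/private/"], allow: ["/public/"], crawlDelay: 10),
]

let txt = Robots.generateTxt(
    baseURL: "https://example.com",
    generateSitemap: true,
    robotsRules: rules
)

// After the "# robots.txt generated by WebUI" header, the file lists
// each rule in order, followed by the sitemap reference:
// User-agent: *
// Disallow: /
// User-agent: Googlebot
// Disallow: /private/
// Allow: /public/
// Crawl-delay: 10
// Sitemap: https://example.com/sitemap.xml
```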