1. Fabric8 Launcher development

This chapter contains information about developing your own runtimes or boosters, contributing changes to existing ones, and improving Fabric8 Launcher.

1.1. Updating booster catalog for local Single-node OpenShift Cluster testing

One of the important stages in the life cycle of a booster is testing it in a booster catalog. Create a custom booster catalog that links to the repository with your booster on GitHub, and test your booster by following the instructions below:

1.1.1. Creating custom booster catalog

By default, the Fabric8 Launcher tool uses a booster catalog from the openshiftio/booster-catalog repository. If you do not want to use the default catalog, you can create one from scratch, or fork the default catalog and make changes to it. This chapter describes the procedure for modifying the default catalog.

Prerequisites
  • A GitHub account

Creating Custom Booster Catalog
  1. Navigate to the openshiftio/booster-catalog repository, which hosts the default booster catalog.

  2. Fork the repository by clicking Fork in the top right corner of the page.

  3. Clone the forked repository locally to your hard drive.

  4. Make changes to the catalog, for example:

    • Modify an existing booster listing by editing the YAML files in the mission and runtime directories.

    • Create a new listing by creating a new YAML file with the following structure in one of the runtime directories:

      Example 1. Catalog YAML File
      githubRepo: USER/REPO
      gitRef: BRANCH

      In the example above, USER is the GitHub user name or organization name and REPO is the name of the repository where the booster is located. BRANCH is the branch in the booster repository you want to make available in the catalog. A filled-in example is shown after this procedure.

    • Modify the metadata.json file where additional information about the boosters is stored.

  5. Commit and push your changes to your forked repository.
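
For illustration, a completed catalog entry might look like the following; the organization, repository, and branch names here are only examples:

githubRepo: my-github-org/my-rest-booster
gitRef: booster-test-branch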

1.1.2. Installing Fabric8 Launcher tool manually

Install a local customized instance of the Fabric8 Launcher tool, which allows you to test the functionality or make modifications to the service using a web interface.

Prerequisites
Procedure
  1. Open the Single-node OpenShift Cluster Web console and log in.

  2. Click New Project to create a new OpenShift project to house the Fabric8 Launcher tool.

    New Project Button
  3. Name the project and optionally provide a description. This example uses my-launcher for the project’s name.

    New Project Config
  4. Click Create to complete the project creation.

  5. Click Import YAML/JSON to add services to your new project from a template.

    Import YAML/JSON
  6. Copy the contents of the current Fabric8 Launcher template from the GitHub repository and paste them into the text box provided.

  7. Click Create, ensure that only the Process the template option is selected, and click Continue.

    Process Template
  8. Fill out the following fields.

    • Your GitHub username.

    • Your GitHub Mission Control access token is your personal access token for GitHub.

    • The Target OpenShift Console URL is the OpenShift Console URL from your Single-node OpenShift Cluster. This should be the same base URL you are currently using to complete the form, for example https://192.168.42.152:8443.

    • The OpenShift username and password from your Single-node OpenShift Cluster, for example developer for both the username and password.

    • KeyCloak URL and KeyCloak Realm MUST be cleared out.

      You must clear these fields out for the Fabric8 Launcher tool on your Single-node OpenShift Cluster to be configured correctly.
    • Set Catalog Git Repository to the repository with the catalog that you are testing. Set Catalog Git Reference to the branch in that repository you are testing.

  9. Before proceeding to the next steps, confirm all the fields are correct. Also confirm that KeyCloak URL and KeyCloak Realm have been cleared out.

  10. Click Create to complete the setup. You will see a screen confirming that the service has been created. Click Continue to overview.

  11. On the overview page, wait and confirm that the four pods for the Fabric8 Launcher tool have completed starting up.

    Fabric8 Launcher booting
  12. When all pods are running, click the link displayed above the pods, which typically ends in nip.io.

    A new browser tab opens with the Fabric8 Launcher tool. This is the same service as https://developers.redhat.com/launch but running in a Single-node OpenShift Cluster.

  13. Start using your Fabric8 Launcher tool to launch booster applications.

Additional resources

You can preview the latest state of Fabric8 Launcher by navigating to the stage build of Fabric8 Launcher.

2. Filing a code issue

TBD

3. Documentation

This chapter contains information about contributing and releasing the launcher.fabric8.io documentation. You can also contribute to the documentation by filing an issue in our Jira project with corrections or feedback.

3.1. Before you start

To contribute to the Fabric8 Launcher documentation, you need to configure the following tools and accounts:

3.1.1. Tools (required)

You must install the following tools on your system:

Asciidoctor

A quick and lightweight tool for local builds that allows you to check factual correctness or the flow of information.

To install Asciidoctor on Fedora, CentOS, or RHEL, run the following command (in Fedora, replace yum with dnf):

# yum install asciidoctor

On Windows, Mac OS X, and other Linux distributions, follow the instructions in the official Asciidoctor documentation.
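
To confirm the installation, you can optionally check the installed version:

$ asciidoctor --version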

Meld (optional)

A graphical tool for comparing files. This tool is useful for comparing Git revisions and resolving merge conflicts.

To install Meld on Fedora, CentOS, or RHEL, run the following command (in Fedora, replace yum with dnf):

# yum install meld

3.1.2. Accounts (required)

To contribute, you must have a GitHub account with GPG configured. To configure GPG:

On RHEL, CentOS, or Fedora, use the gpg2 binary everywhere instead of gpg. On MacOS, use gpg.
Configuring GPG
  1. Generate a new GPG key.

  2. Add the GPG key to your GitHub account.

  3. Configure GPG with git on your machine.

    On RHEL, CentOS, or Fedora, ensure that you also add gpg2 as the signing program according to the linked instructions.
  4. In the launcher-documentation repository, set automatic signing with your GPG key:

    $ git config commit.gpgsign true

    For more information, see Signing commits using GPG.

  5. On RHEL, CentOS, or Fedora, set the commit signing program to gpg2. On MacOS, set the commit signing program to gpg.

    $ git config --global gpg.program gpg2
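
Optionally, after your next commit in the repository, you can verify that the commit was signed with your key:

$ git log --show-signature -1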

To make a contribution to the launcher-documentation repository, you need to set up the Git hooks directory. At commit time, these hooks automatically check that your contribution does not break important parts of the repository.

Setting up Git Hooks in launcher-documentation Repository
  1. In a terminal application, navigate to the directory with the launcher-documentation repository.

  2. Set the Git hook directory to $REPO_HOME/.githooks:

    $ git config core.hooksPath .githooks
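
To confirm the setting, you can read the value back; it should print .githooks:

$ git config core.hooksPath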

3.2. Repository

3.2.1. Repository location

The repository for the mission and booster documentation, the Launchpad front page, and this contributor guide is hosted on GitHub in the launcher-documentation repository.

The runtime documentation repositories are hosted in the following locations:

3.2.2. Repository filesystem layout

The following diagram describes the filesystem layout of the launcher-documentation repository:

The triple-dot indicates there are more files or directories in the particular directory.
Filesystem Layout
.
├── docs/ (1)
│   ├── topics/ (2)
│   │   ├── images/ (3)
│   │   ├── styles/ (4)
│   │   ├── templates/ (5)
│   │   │   ├── document-attributes.adoc (6)
│   │   │   ├── ...
│   │   │   ├── thorntail (7)
│   │   ├── circuit-breaker-mission-design-tradeoffs.adoc
│   │   ├── ...
│   ├── $GUIDE_NAME/ (8)
│   │   ├── master.adoc (9)
│   │   ├── build_guide.sh (10)
│   │   └── topics/ -> ../topics (11)
│   ├── ...
├── scripts/ (12)
│   ├── build_guides.sh (13)
│   ├── validate_guides.sh (14)
│   ├── ...
├── cico_build_deploy (15)
├── CHANGELOG.adoc (16)
└── README.adoc
1 The directory with the sources of all the launcher.fabric8.io guides, the launchpad.openshift.io page, and the contributor guide.
2 The directory with the actual sources in AsciiDoc files. This directory is shared among all guides.
3 The directory with all the images used in the sources.
4 The directory with all the stylesheets used in the sources.
5 The directory with all the templates used in the sources.
6 The file where all the common document attributes are defined.
7 The directory with sources synchronized from the Thorntail repository.
8 The directory with the sources of a guide. Each guide has exactly one directory like this.
9 The main AsciiDoc file of the guide. The files from the topics directory are included from this file.
10 The script for building the particular guide.
11 A symbolic link to the shared $REPO_HOME/docs/topics/ directory.
12 The directory with scripts used for manipulating files in the directory, building, and more.
13 The script for building all guides at once. This script does not validate the guides.
14 The script for validating all guides at once.
15 The script to start a local server and to publish to production. For more information, see Starting preview server.
16 The file with changes made in each release.
Do not edit files in the $REPO_HOME/docs/topics/thorntail directory, always submit a pull request to the Thorntail repository and synchronize the files afterwards.
Additional resources

3.2.3. Branches and tags

The following branches and tags are used in the launcher-documentation repository:

Table 1. Branch Names and Contents
Branch Content

master

The latest version of the documentation in development.

Table 2. Tag Format and Contents
Tag Content

Date in the YYYY-MM-DD format

The documentation as released on the particular date.

Date in the YYYY-MM-DD_$NUMBER format

A subsequent release of the documentation on that date.

3.3. Contributing

3.3.1. Git workflow for making changes

When making contributions, follow the standard GitHub flow. Below, you will find more details about the steps that are required in the launcher-documentation repository.

Do not edit files in the $REPO_HOME/docs/topics/thorntail directory, always submit a pull request to the Thorntail repository and synchronize the files afterwards.
Prerequisites
Contributor Workflow
  1. Create a topic branch in your personal fork of the launcher-documentation repository. If you want to resolve a particular issue with your changes, name the branch issue-$ID, where $ID is the numerical ID of the issue. An example command sequence is shown after this procedure.

  2. Commit and push your changes to the topic branch. Verify your changes:

    • Preview the build locally. For more information, see Building locally.

    • Optionally validate the books. This step is done automatically by a commit hook at commit time, but you can validate the books manually at any time. Execute the following script:

      $ ./scripts/validate_guides.sh

      If there are errors in validation, open the master.xml file in the directory of the book that failed to validate. The validation output should tell you where the issue is. Modify the master.adoc or another included *.adoc file accordingly.

      If you are unsure, ask someone from the team for help.

  3. Create a pull request against the branch you based your contribution on. Ensure you also:

    1. Label the pull request appropriately. For more information about labels, see Using GitHub labels.

    2. Reference the issue that your pull request resolves in the description, using the formulation resolves $ID, where $ID is the numerical ID of the issue. Doing this automatically closes the issue when the pull request is accepted.

    A team member (a reviewer) will review your changes for factual and stylistic correctness. If the changes are acceptable, the reviewer merges them and closes the pull request. The reviewer may ask you to make further modifications if necessary, or file a pull request against your branch with their suggested modifications.

    Do not worry if your pull request is shown as closed rather than merged; the changes are merged manually instead of through the GitHub web UI.
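
As a rough sketch, the commands for steps 1 and 2 might look like the following, with the editing of the *.adoc files happening after the branch is created; the branch name, file name, commit message, and remote name are only examples:

$ git checkout -b issue-123
$ ./scripts/validate_guides.sh
$ git add docs/topics/proc_changed-topic.adoc
$ git commit -m "Fix a broken step in the changed topic"
$ git push origin issue-123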

3.3.2. Modifying existing content

To modify an existing guide, you must understand how it is structured as well as the basic workflow for modifying it.

Guide structure

Each guide is stored in a separate directory. The main file for the guide is $REPO_HOME/docs/GUIDE_NAME/master.adoc.

Example master.adoc file
include::topics/templates/document-attributes.adoc[] (1)
//var for front-end topics, if below is defined in topic, its used in docs, if not its used in the front end (2)
:docs-topic: (3)

= {contrib-guide-name} (4)


== {launcher} Development

This chapter contains information about developing your own runtimes or boosters, contributing changes to existing ones, and improving {launcher}.

include::topics/assembly_updating-booster-catalog-for-local-openshiftlocal-testing.adoc[leveloffset=+2] (5)

You can preview the latest state of {launcher} by navigating to the link:{launcher-stage}[stage build] of {launcher}.

== Filing a code issue
...
1 Includes a file that sets common attributes across all guides.
2 A comment that will not be rendered in the final output.
3 Defines a new attribute called docs-topic.
4 Sets the document title, in this case it uses the existing attribute contrib-guide-name. This attribute is defined in $REPO_HOME/docs/topics/templates/document-attributes.adoc.
5 Adds the content from $REPO_HOME/docs/topics/assembly_updating-booster-catalog-for-local-openshiftlocal-testing.adoc and adds a level offset of two, which means heading levels are offset by two. For example, first-level headings in that file are rendered as third-level headings.

There are a few important, heavily used concepts that you need to learn:

document-attributes file

$REPO_HOME/docs/topics/templates/document-attributes.adoc sets common variables that are used across all the guides. Every guide includes this file.

partitioning content

Sections of content in each guide are separated into smaller files. This allows for content to be easily refactored within a guide as well as reused across different guides. The include directive is used to include these files in a piece of content.

modular documentation

An individual piece of content, called a module, is formatted in a specific way, which allows for better reuse across guides.

common topics directory

All of the smaller content files and modules are located in the $REPO_HOME/docs/topics/ directory. This is a common directory that contains all shared files across all guides. To help the build process, the $REPO_HOME/docs/topics/ directory is symlinked in each guide’s directory.
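
For example, you can check the symbolic link from any guide directory; the guide name below is just an example:

$ readlink docs/contribution-guide/topics
../topics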

Basic workflow for modifying existing guides
Prerequisites
Procedure
  1. Find the guide you want to modify and open the master.adoc file in the directory of the guide.

  2. Find the section that you want to modify and open the appropriate file.

    Many text editors have a way to quickly open a file by name. For example, the Atom editor has a nice Find File command that allows you to quickly search by filename and path.
  3. Make the changes.

  4. Verify the changes by doing a local build and previewing the result.

  5. Submit your changes according to the instructions in Git workflow for making changes.

3.3.3. Adding new guide

To add a new guide to the existing documentation set, perform the following steps (a consolidated command sketch is shown after the procedure):

Adding New Guide
  1. Create a new directory for the guide under docs/, for example docs/my-new-guide.

  2. Create basic infrastructure in the directory of the new guide:

    • Create a master.adoc file.

    • Create a symbolic link called topics that points to ../topics:

      $ ln -s ../topics topics
    • Copy the build_guide.sh script from another guide’s folder. Modify the script so that the GUIDE_HTML_NAME variable is set to the directory name of the new book, for example:

      GUIDE_HTML_NAME=my-new-guide
  3. Add the default content to the master.adoc file:

    include::topics/templates/document-attributes.adoc[]
    
    :my-new-guide:
    
    = {my-new-guide-guide-name}
  4. Add the new guide name to the docs/topics/templates/document-attributes.adoc file:

    :my-new-guide-guide-name: My New Guide
  5. Ensure the new guide builds correctly by executing the scripts/build_guides.sh script.

  6. Open the generated HTML file to ensure everything is rendered correctly:

    $ firefox html/my-new-guide.html

    In the command above, replace firefox with the browser of your choice.

  7. Write the content of the guide. When you are ready for some initial review and feedback, file a pull request to the master branch in the launcher-documentation repository.
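
A consolidated sketch of steps 1 and 2 on the command line, run from the repository root, might look like this; the new guide name and the guide you copy the script from are only examples:

$ mkdir docs/my-new-guide
$ cd docs/my-new-guide
$ ln -s ../topics topics
$ cp ../contribution-guide/build_guide.sh .

Afterwards, edit the copied build_guide.sh and set GUIDE_HTML_NAME=my-new-guide as described in step 2.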

3.3.4. Building locally

When you are making a contribution, you should preview your changes before you commit or push them. This section describes the most typical procedure of doing so.

Prerequisites
  • Asciidoctor

Building a Single Guide
  1. In a console application, navigate to the directory of the book you want to build.

    Example:

    $ cd $REPO_HOME/docs/contribution-guide/

    Replace $REPO_HOME with the real path to the repository.

  2. Execute the build_guide.sh script:

    $ ./build_guide.sh
  3. View the resulting document in the $REPO_HOME/html directory.

    Example:

    $ firefox $REPO_HOME/html/docs/contribution-guide.html

    Replace $REPO_HOME with the real path to the repository, and firefox with the name of your preferred browser.

Building All Guides
  1. In a console application, navigate to the $REPO_HOME/scripts directory.

  2. Build all books by executing the build_guides.sh script:

    $ ./build_guides.sh
  3. View the resulting documents in the $REPO_HOME/html directory:

    $ firefox $REPO_HOME/html/docs/*.html

    Replace $REPO_HOME with the real path to the repository.

3.3.5. Starting preview server

You can preview the documentation site locally by building and deploying the Docker containers with the documentation site on your machine.

Prerequisites
  • Docker installed.

  • Docker daemon running.

Running Preview Server Locally
  1. Execute the scripts/preview_server.sh script. If you execute it as a regular user, you will be asked for administrator privileges because they are required for operating the docker binary.

    $ scripts/preview_server.sh
    We do not recommend adding the regular user to the docker group. For more information, see the Related Information section below.
  2. Navigate to http://localhost in your browser.

  3. When you are done, stop the Docker container with the server:

    1. Go back to the terminal where the script is running.

    2. Press Ctrl + C.

Additional resources

3.3.6. Validating guides

To ensure that all books can be built, validate them using the automatic script provided in the repository.

Validating all guides

The validation is performed on DocBook XML files generated from the AsciiDoc sources because it is very difficult to perform validations on AsciiDoc sources.

To perform the validation, execute the scripts/validate_guides.sh script:

$ scripts/validate_guides.sh
Excluding books from validation

If a failed validation is expected behavior, for example when there is a bug in the XML generation, you can exclude the book from validation by adding a file named .ci-ignore to the main directory of the book, for example:

$ touch docs/thorntail-runtime/.ci-ignore

To remove the book from validation permanently, commit this file to Git, for example:

$ git add docs/thorntail-runtime/.ci-ignore
$ git commit docs/thorntail-runtime/.ci-ignore -m "Removed Thorntail Guide from validation"

To re-enable validation of the book, remove the .ci-ignore file, for example:

$ rm docs/thorntail-runtime/.ci-ignore

3.4. Requirements

This chapter contains the requirements your contributions must meet in order to be accepted.

3.4.1. Writing style

Follow the rules of technical writing as described in the following guides; your pull requests will be evaluated against the requirements described in them. If you do not have access to some of them, use the guides that are available to you:

IBM Style Guide

A general guide to writing technical documentation. Available only in print or in commercial digital copies.

Red Hat Stylepedia

A freely-available guide partially overlapping with the IBM Style Guide. Contains information about writing Red Hat–specific concepts.

Red Hat AsciiDoc Conventions Samples Guide

A freely-available guide about writing particular elements in the AsciiDoc source code, for example GUI buttons, external links, or command blocks. It is important you follow these rules for consistency with other parts of the documentation.

Minimalism in Red Hat Documentation

A Red Hat internal document about applying minimalism to documentation.

Writing rules specific to this repository

When writing new content, adhere to the following rules:

  • When using the en dash, use the {ndash} attribute, as shown in the example after this list.

    Do not use:

    • – because it breaks validation.

    • – even though {ndash} is currently set to it, because the attribute can change in the future.
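
For example, in the AsciiDoc source, a sentence using the attribute might look like this (the sentence itself is just an illustration):

The build takes 5{ndash}10 minutes to complete.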

3.4.2. Writing the modular way

When authoring new content, do not write a monolithic piece of text, but write the information in smaller, self-contained pieces called modules. The modules used in this documentation use the templates described in the Red Hat Modular Documentation Reference Guide.

You will be asked to modify new content that you submit if it does not follow the correct templates.

3.5. Administration

3.5.1. Reviewing changes

Ensure you perform all the following actions when reviewing a pull request:

Prerequisites
  • GitHub account with administration privileges to Jenkins builds. For more information, see Additional resources in this chapter.

Reviewing Pull Request
  1. Approve the test build if there is a comment from the centos-ci user asking "Can one of the admins verify this patch?". That means the pull request comes from an external contributor. To approve the build, reply [test] in comments.

  2. Check if the test build passed.

    If it failed, review the build log by clicking Show all checks and then Details in the box at the bottom of the comment section. In the page that opens, click Console Output. Search for the string Running tests… and diagnose the problem.

  3. Click the Files Changed tab and review the changes. Pay special attention to the following matters:

    • Correct use of attributes.

    • Correct file placement in the repository layout.

    • Correct naming of products.

    If you are unsure whether the changes render correctly, build the contribution locally (a command sketch is shown after this procedure):

    1. Add the contributor’s repository as a new remote and fetch it.

    2. Checkout the source branch of the pull request. You can find it below the pull request title in the from field.

    3. Run the preview server locally and review the built documentation.
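
A command sketch for building the contribution locally might look like the following; the contributor name and branch name are only examples:

$ git remote add contributor git@github.com:contributor/launcher-documentation.git
$ git fetch contributor
$ git checkout -b review-pr contributor/feature-branch
$ scripts/preview_server.sh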

3.5.2. Merging contributions

When you or someone else has reviewed the changes in a pull request, you can merge them into upstream/master.

Do not merge your own pull requests unless absolutely necessary. You are accountable for any errors, merge conflicts, or bugs resulting from incorrect merging. Let another contributor with the appropriate repository access privileges do it.
Prerequisites
  • GitHub account with merge access to the launcher-documentation repository and GPG configured.

Merging a Pull Request
  1. Ensure that you have the contributor’s fork of the launcher-documentation repository in your list of remotes:

    $ git remote -v
  2. To add a remote fork to your Git directory:

    1. Navigate to the contributor’s fork on GitHub.

    2. Click the Clone or Download button.

    3. Select Use SSH.

    4. Copy the repository URL beginning with git@github.com:.

    5. Execute the following command, replacing username with the user name of the contributor:

      $ git remote add username git@github.com:username/launcher-documentation.git
    6. Use git remote -v to verify that the fork is now listed among your remotes.

    7. Execute git remote update to fetch the latest state of the remotes to your working directory.

  3. Create a new branch and set it to track the remote branch that introduces the changes in the pull request:

    $ git checkout -b pr-to-merge username/featurebranch

    You should see the following message:

    Branch pr-to-merge set up to track remote branch <topic_branch> from <username> by rebasing.
    Switched to a new branch 'pr-to-merge'
  4. Fetch the latest state of upstream and rebase your topic branch:

    $ git fetch upstream
    $ git rebase upstream/master

    If you have multiple commits on your feature branch, squash them into a single commit before merging.

  5. Test if all the books build correctly with your changes by executing the $REPO_HOME/scripts/build_guides.sh script. If the output contains no errors, the guides build correctly.

  6. Checkout your local master branch:

    $ git checkout master
  7. Rebase your local master branch against upstream/master:

    $ git rebase upstream/master
  8. Merge your topic branch into master:

    $ git merge pr-to-merge
  9. Execute $REPO_HOME/scripts/build_guides.sh to test whether the guides build correctly with the changes introduced by your merge.

  10. Push the latest changes into upstream:

    $ git push upstream master

    The changes are now merged into upstream/master.

  11. Close the pull request for the changes you just merged:

    1. Obtain the commit hash of the last commit on master:

      $ git log -1 HEAD
    2. Navigate to the pull request on GitHub.

    3. Paste the following line into a new comment, replacing <commit-hash> with the SHA-1 of your commit:

      merged: <commit-hash>
    4. Click Close and comment.

3.5.3. Syncing with Thorntail docs

Some documentation files for the Thorntail runtime are sourced directly from the Thorntail repository. These files are stored in the $REPO_HOME/docs/topics/thorntail directory. Do not edit these files directly; always submit a pull request to the Thorntail repository. When the pull request is accepted and synchronized to the Thorntail product repository, follow the procedure below.

The synchronization script deletes the existing $REPO_HOME/docs/topics/thorntail directory and replaces it with the latest version of the master branch from the upstream repository. Edit the variables in the $REPO_HOME/scripts/sync-with-thorntail/sync.sh script to customize it. The variables are documented in the script.

Some files are not present in the upstream repository because they are generated. The script automatically builds the upstream project with Maven, ensuring these files are not missing.

The hash of the synced upstream commit is stored in the docs/topics/thorntail/docs/commit.hash file.
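
For example, to see which upstream commit is currently synced, print the contents of that file:

$ cat docs/topics/thorntail/docs/commit.hash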

Prerequisites
  • Maven installed

Procedure
  1. Execute the $REPO_HOME/scripts/sync-with-thorntail/sync.sh script:

    $ ./scripts/sync-with-thorntail/sync.sh

3.5.4. Releasing new version

This section contains all information you need to release a new version of the documentation set to production. This update can happen anytime after the catalog has been updated, including after the release train has been completed.

Releasing New Version to Production
  1. Ensure the catalog has been updated. Contact the development team for this information.

  2. Synchronize the Thorntail sources. For more information, see Syncing with Thorntail docs.

  3. Tag the release:

    1. Execute the tag_release.sh script in the $REPO_HOME/scripts directory:

      $ ./tag_release.sh

      The script automatically tags the commit with the current date in the YYYY-MM-DD format. If you want to tag with a different date or manually, execute the following command:

      $ git tag 2017-04-21

      If you are re-releasing on the same day, a suffix _2, _3, and so on is appended, for example 2017-04-21_2.

    2. Push the changes and tags:

      $ git push --tags $REMOTE

      Replace $REMOTE with the name of the upstream remote.

  4. Request publication:

    1. File a pull request in the openshiftio/saas-launchpad repository, where you change the hash value to the hash of the commit you want to publish from the launcher-documentation repository. Usually, this will be the latest commit in the master branch.

    2. Wait for the pull request to be accepted. When that happens, verify that the production build succeeded.

    3. Once your changes have been merged and the build succeeds, delete the topic branch you used to introduce the update.

      $ git branch -d $TOPIC_BRANCH_NAME

3.6. Issues

3.6.1. GitHub issue workflow

Each issue in the launcher-documentation repository goes through the following life cycle. In each stage, the descriptions list the actions the following people must take:

GitHub Issue Life Cycle
  1. The issue is filed.

    Reporter

    Describe the problem briefly, but in sufficient detail.

    Team Member

    Ensure the issue has correct labels. For more information, see Using GitHub labels.

  2. A team member or a contributor assigns the issue.

    Assignee

    Label the issue In Progress, add it to the Current Development milestone, and address the issue. For more information, see Git workflow for making changes.

  3. The assignee files a pull request that addresses the issue.

    Assignee

    Change the issue label from In Progress to SME Review and ask an SME (a subject matter expert) to review the pull request.

    SME

    Provide feedback either in the pull request comments or by filing another pull request in the assignee’s repository.

  4. All comments from the SME are addressed.

    Assignee

    Change the issue label from SME Review to Peer Review. Ask a writer to perform a peer review.

    Writer (Peer)

    Provide feedback either in the pull request comments or by filing another pull request in the assignee’s repository.

  5. If all comments from the peer have been addressed:

    Team Member with Merge Access

    Merge the pull request. If further changes need to be made, file a follow-up issue.

    Assignee

    Close the issue if it has not been closed automatically. Remove the Peer Review label.

3.6.2. Using GitHub labels

In the launcher-documentation repository, labels are an important way of communicating the status of issues and pull requests. Please use them accordingly to help maintain an efficient work environment.

There are several groups of labels based on their meaning:

Issue status

Use the following labels to indicate what stage of its life cycle an issue or pull request is in.

Table 3. Issue Statuses
Label Description

Status | In Progress

The assignee is actively working on the issue or is waiting for input from someone. The draft of the issue is not ready.

Status | SME Review

An SME is providing a review of the draft. This review comes before the peer review.

Status | Peer Review

A peer (a writer) is providing a review of the draft. This review comes after the SME review.

Status | Blocked

The issue cannot proceed because it depends on another issue being fixed first.

Status | Waiting for SME

The issue or pull request is waiting for an SME to start the review.

Status | Waiting for Peer

The issue or pull request is waiting for a peer to start review.

Issue type

Use the following labels to describe what the content of the issue is.

Table 4. Issue Types
Label Description

Type | Bug

A part of the documentation is factually wrong or outdated, and needs fixing.

Type | Discussion

The issue is likely to cause a lot of questions or a discussion.

Type | Enhancement

Documentation can be improved or its scope can be enlarged.

Issue priority

In addition to the default priority, which is denoted by the lack of a priority label, use the following labels to assign priority to issues:

Table 5. Issue Priority
Label Description

Priority | Blocker

The issue has the highest priority, and development cannot continue unless the issue is fixed.

Priority | High

The issue has high priority.

Priority | Low

The issue has low priority.

Issue relations

The following labels describe the relation between issues and pull requests:

Table 6. Issue and Pull Request Relations
Label Description

Issue | Follow-up

The issue is a follow-up to a pull request filed by a non-writer.

Issue | Has PR

A pull request resolving the issue exists and is linked.

Expected effort

In addition to the default expected effort, which is denoted by the lack of an effort label, use the following labels to describe the level of effort you expect resolving an issue or pull request will require:

Table 7. Expected Effort to Fix
Label Description

Effort | High

The issue is very large or complex. A lot of time or knowledge is required.

Effort | Low

The issue is simple and does not require much time to be fixed. Suitable for newcomers or for a writer with little time between other issues.

Effort | Easyfix

The issue is trivial and will take only a few minutes to fix.

Pull request type

Use the following labels to describe how the pull request should be dealt with:

Table 8. Pull Request Types
Label Description

Contributor | Internal

The pull request was submitted by a writer, and should not be accepted until thoroughly proofread and edited.

Contributor | Public

The pull request was submitted by a non-writer contributor. It may be accepted if it meets basic quality criteria, but the person merging the pull request must file a follow-up issue where they describe the necessary changes.

The contribution must be reviewed before merging to ensure the changes are factually correct.
Runtime, mission, and component labels

Use the following labels to describe which runtimes, missions, or components an issue or pull request concerns. To indicate that an issue is related to a certain booster, combine a runtime and a mission label.

Table 9. Runtime Labels
Label Description

Runtime | Node.js

The issue or pull request relates to the Node.js runtime.

Runtime | Spring Boot

The issue or pull request relates to the Spring Boot runtime.

Runtime | Vert.x

The issue or pull request relates to the Eclipse Vert.x runtime.

Runtime | WildFly Swarm

The issue or pull request relates to the WildFly Swarm runtime.

Table 10. Mission Labels
Label Description

Mission | Circuit Breaker

The issue or pull request relates to the Circuit Breaker Mission.

Mission | CRUD

The issue or pull request relates to the Relational Database Backend Mission.

Mission | ConfigMap

The issue or pull request relates to the ConfigMap Mission.

Mission | Health Check

The issue or pull request relates to the Health Check Mission.

Mission | REST API

The issue or pull request relates to the REST API Level 0 Mission.

Mission | SSO

The issue or pull request relates to the Secured Mission.

Table 11. Component Labels
Label Description

Component | Contrib Guide

The issue or pull request relates to the Contribution Guide.

Component | Getting Started Guide

The issue or pull request relates to the Getting Started Guide.

Component | Minishift Guide

The issue or pull request relates to the Minishift Installation Guide.

Component | Frontend

The issue or pull request relates to the frontend HTMLs etc.

Component | Infrastructure

The issue or pull request relates to publishing, building, CI, etc.

Component | Release Notes

The issue or pull request relates to information to be included in the Release Notes.

Appendix A: Upstream runtime project development resources

TBD

Appendix B: Glossary

B.1. Product and project names

developers.redhat.com/launch

developers.redhat.com/launch is a standalone getting started experience offered by Red Hat for jumpstarting cloud-native application development on OpenShift. It provides a hassle-free way of creating functional example applications, called missions, as well as an easy way to build and deploy those missions to OpenShift.

Fabric8 Launcher

The Fabric8 Launcher is the upstream project on which developers.redhat.com/launch is based.

Single-node OpenShift Cluster

An OpenShift cluster running on your machine using Minishift.

B.2. Terms specific to Fabric8 Launcher

Booster

A language-specific implementation of a particular mission on a particular runtime. Boosters are listed in a booster catalog.

For example, a booster might be a web service with a REST API implemented using the Thorntail runtime.

Booster Catalog

A Git repository that contains information about boosters.

Mission

An application specification, for example a web service with a REST API.

Missions generally do not specify which language or platform they should run on; the description only contains the intended functionality.

Runtime

A platform that executes boosters. For example, Thorntail or Eclipse Vert.x.