Persisting workflow data using artifacts
Artifacts allow you to share data between jobs in a workflow and store data once that workflow has completed.
GitHub Actions is currently in limited public beta and is subject to change. We strongly recommend that you do not use this feature for high-value workflows and content during the beta period. For more information about the beta, see "About GitHub Actions."
For more information about using GitHub Actions, see "Automating your workflow with GitHub Actions."
In this article
- About workflow artifacts
- Passing data between jobs in a workflow
- Sharing data between workflow runs
- Uploading build and test artifacts
About workflow artifacts
Artifacts allow you to persist data after a job has completed. An artifact is a file or collection of files produced during a workflow run. You can use artifacts to pass data between jobs in a workflow, or persist build and test output after a workflow run has ended. GitHub stores artifacts for 90 days for pushes and 30 days for pull requests. The retention period for a pull request restarts each time someone pushes new commits to the pull request.
You must upload artifacts during a workflow run. GitHub provides two actions that you can use to upload and download build artifacts. Files uploaded to a workflow run are archived using the .zip format. For more information, see the actions/upload-artifact and actions/download-artifact actions.
Each job in a workflow runs in a fresh instance of the virtual environment. When the job completes, the runner terminates and deletes the instance of the virtual environment.
To share data between jobs:
- You can upload the data before the job ends to save the file inside the workflow run. When you upload an archive, you must give it a name.
- After uploading a file, you can download it in another job within the same workflow run. When you download an archive, you can reference the archive by name. You can only download artifacts that were uploaded during the same workflow run.
Steps that are part of a job run in the same virtual environment, but run in their own individual processes. To pass data between steps in a job, you can use inputs and outputs. For more information about inputs and outputs, see "Metadata syntax for GitHub Actions."
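For example, because steps in a job share the same virtual environment, a later step can simply read a file that an earlier step wrote to the workspace, with no artifact upload needed. This is a minimal sketch (the step names and file name are illustrative, not part of any API):

```yaml
jobs:
  example:
    runs-on: ubuntu-latest
    steps:
      # An earlier step writes a file into the shared workspace.
      - name: Write a value
        shell: bash
        run: echo "hello" > value.txt
      # A later step in the same job reads it directly -- no
      # upload-artifact/download-artifact round trip is required,
      # because both steps run in the same virtual environment.
      - name: Read the value
        shell: bash
        run: cat value.txt
```

Artifacts are only needed once the data must cross a job boundary, because each job gets a fresh virtual environment.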
These are some of the common build and test output artifacts that you might want to upload:
- Log files and core dumps
- Test results, failures, and screenshots
- Binary or archive files
- Stress test performance output and code coverage results
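For instance, to keep log files only when something goes wrong, you could guard an upload step with a status check function so that debugging output is retained for failed runs. This is a sketch; the `production-logs` artifact name and the `logs` path are hypothetical:

```yaml
steps:
  # ... build and test steps ...
  # Upload logs only if an earlier step failed, so debugging
  # output is preserved without cluttering successful runs.
  - name: Upload logs on failure
    if: failure()
    uses: actions/upload-artifact@v1
    with:
      name: production-logs
      path: logs
```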
Passing data between jobs in a workflow
You can use the upload-artifact and download-artifact actions to share data between jobs in a workflow. This example workflow illustrates how to pass data between jobs in the same workflow. For more information, see the actions/upload-artifact and actions/download-artifact actions.
Jobs that are dependent on a previous job's artifacts must wait for that job to complete successfully. This workflow uses the needs keyword to ensure that job_1, job_2, and job_3 run sequentially. For example, job_2 requires job_1 using the needs: job_1 syntax.
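Stripped down to just the dependency wiring, the chain of needs declarations looks like this (steps omitted for brevity):

```yaml
jobs:
  job_1:
    # No needs key: job_1 starts as soon as the workflow runs.
    runs-on: ubuntu-latest
  job_2:
    # job_2 waits for job_1 to complete successfully.
    needs: job_1
    runs-on: ubuntu-latest
  job_3:
    # job_3 waits for job_2, so the three jobs run sequentially.
    needs: job_2
    runs-on: ubuntu-latest
```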
Job 1 performs these steps:
- Adds the values 3 and 7 and outputs the result to a text file called math-homework.txt.
- Uses the upload-artifact action to upload the math-homework.txt file with the artifact name homework. When a later job downloads this artifact, the download-artifact action places the file in a directory named homework.
Job 2 uses the result from the previous job:
- Downloads the homework artifact uploaded in the previous job. By default, the download-artifact action downloads artifacts to the workspace directory that the step is executing in. You can specify a different download directory using the path input parameter.
- Reads the value in the homework/math-homework.txt file, multiplies it by 9, and updates the file with the new value.
- Uploads the math-homework.txt file. This upload overwrites the previous upload because both uploads share the same artifact name.
Job 3 displays the result uploaded in the previous job:
- Downloads the homework artifact.
- Prints the result of the math equation to the log.
The full math operation performed in this workflow example is (3 + 7) x 9 = 90.
```yaml
name: Share data between jobs

on: [push]

jobs:
  job_1:
    name: Add 3 and 7
    runs-on: ubuntu-latest
    steps:
      - shell: bash
        run: |
          expr 3 + 7 > math-homework.txt
      - name: Upload math result for job 1
        uses: actions/upload-artifact@v1
        with:
          name: homework
          path: math-homework.txt
  job_2:
    name: Multiply by 9
    needs: job_1
    runs-on: windows-latest
    steps:
      - name: Download math result for job 1
        uses: actions/download-artifact@v1
        with:
          name: homework
      - shell: bash
        run: |
          value=`cat homework/math-homework.txt`
          expr $value \* 9 > homework/math-homework.txt
      - name: Upload math result for job 2
        uses: actions/upload-artifact@v1
        with:
          name: homework
          path: homework/math-homework.txt
  job_3:
    name: Display results
    needs: job_2
    runs-on: macOS-latest
    steps:
      - name: Download math result for job 2
        uses: actions/download-artifact@v1
        with:
          name: homework
      - name: Print the final result
        shell: bash
        run: |
          value=`cat homework/math-homework.txt`
          echo The result is $value
```
Sharing data between workflow runs
After a workflow ends, you can download an archive of the uploaded artifacts on GitHub by finding the workflow run in the Actions tab. GitHub does not currently offer a REST API to retrieve uploaded artifacts.
If you need to access artifacts from a previously run workflow, you'll need to store the artifacts somewhere. For example, you could run a script at the end of your workflow to store build artifacts on Amazon S3 or Artifactory, and then use the storage service's API to retrieve those artifacts in a future workflow.
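As a sketch, a final workflow step could copy build output to an S3 bucket with the AWS CLI. The bucket name and secret names here are placeholders you would configure yourself, not values GitHub provides:

```yaml
steps:
  # ... build steps that produce the dist directory ...
  # Copy build output to long-term storage so future workflow
  # runs (or other systems) can retrieve it via the S3 API.
  - name: Store artifacts on Amazon S3
    run: aws s3 cp dist "s3://my-artifact-bucket/${GITHUB_SHA}/" --recursive
    env:
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```

Keying the storage path on the commit SHA makes it straightforward for a later workflow to look up the artifacts for a specific commit.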
Uploading build and test artifacts
You can create a continuous integration (CI) workflow to build and test your code in a GitHub-hosted virtual environment. For more information about using GitHub Actions to perform CI, see "About continuous integration."
The output of building and testing your code often produces files you can use to debug test failures and production code that you can deploy. You can configure a workflow to build and test the code pushed to your repository and report a success or failure status. You can upload the build and test output to use for deployments, debugging failed tests or crashes, and viewing test suite coverage.
You can use the upload-artifact action to upload artifacts. For more information, see the actions/upload-artifact action.
Assuming your build configuration outputs the compiled files in the dist directory, you would deploy the files in the dist directory to your web application server if all tests completed successfully.
```
|-- hello-world (repository)
|   └── dist
|   └── tests
|   └── src
|       └── sass/app.scss
|       └── app.ts
|   └── output
|       └── test
```
This example shows you how to create a workflow for a Node.js project that builds the code in the src directory and runs the tests in the tests directory. You can assume that running npm test produces a code coverage report named code-coverage.html stored in the output/test directory.
The workflow uploads the production artifacts in the dist directory and the code-coverage.html file as two separate artifacts.
```yaml
name: Node CI

on: [push]

jobs:
  build_and_test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v1
      - name: npm install, build, and test
        run: |
          npm install
          npm run build --if-present
          npm test
      - name: Archive production artifacts
        uses: actions/upload-artifact@v1
        with:
          name: dist
          path: dist
      - name: Archive code coverage results
        uses: actions/upload-artifact@v1
        with:
          name: code-coverage-report
          path: output/test/code-coverage.html
```