Why is the Bitbucket Pipelines variable $BITBUCKET_REPO_SLUG not being expanded to the repository name when building a ZIP file?

I am creating a Bitbucket pipeline to deploy code from Bitbucket to an AWS EC2 instance. The steps required to do this in the pipeline are:

  1. Package all the code from Bitbucket into a zip file
  2. Upload that ZIP file to an S3 bucket
  3. Deploy the ZIP file to the EC2 instance using AWS Code Deploy

I want the ZIP file to be named <repository_name>.zip when it is uploaded to the S3 bucket. To achieve that, I use the $BITBUCKET_REPO_SLUG pipeline variable and set up the first step of the pipeline as shown below, where applications is the folder inside the repository that I want to package into the ZIP file.

    staging:
      - step:
          name: Zip Code
          image: atlassian/default-image:3
          script:
            - zip -r "$BITBUCKET_REPO_SLUG.zip" "applications"
          artifacts:
            - "$BITBUCKET_REPO_SLUG.zip"

However, from the pipeline output below (under Build teardown) you can see that $BITBUCKET_REPO_SLUG.zip is not expanded to <repository_name>.zip as expected.

zip -r "$BITBUCKET_REPO_SLUG.zip" "applications"
+ zip -r "$BITBUCKET_REPO_SLUG.zip" "applications"
  adding: applications/ (stored 0%)
  adding: applications/configuration/ (stored 0%)
  adding: applications/configuration/trade_capture_trayport_private.ini (deflated 58%)
  adding: applications/trade_capture_etrm/ (stored 0%)
  adding: applications/trade_capture_etrm/trade_capture_trayport_private.ps1 (deflated 73%)
  adding: applications/trade_capture_etrm/etrm_private_alerting.ps1 (deflated 65%)
  adding: applications/trade_capture_etrm/tests/ (stored 0%)
  adding: applications/trade_capture_etrm/tests/trayport.tests.ps1 (deflated 93%)
  adding: applications/appspec.yml (deflated 70%)

Build teardown
  Searching for files matching artifact pattern $BITBUCKET_REPO_SLUG.zip
  Searching for test report files in directories named [test-results, failsafe-reports, test-reports, TestResults, surefire-reports] down to a depth of 4
  Finished scanning for test reports. Found 0 test report files.
  Merged test suites, total number tests is 0, with 0 failures and 0 errors.

In the next step, which uploads the file to S3, I use the same approach to reference the ZIP file, as shown in this code:

      - step:
          name: ⬆️ Upload to S3
          services:
            - docker
          oidc: true
          script:
            # Test upload
            - pipe: atlassian/aws-code-deploy:1.1.1
              variables:
                AWS_DEFAULT_REGION: $AWS_REGION
                AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
                COMMAND: 'upload'
                APPLICATION_NAME: $APPLICATION_NAME
                ZIP_FILE: "$BITBUCKET_REPO_SLUG.zip"
                S3_BUCKET: $S3_BUCKET_STAGING
                VERSION_LABEL: $BITBUCKET_REPO_SLUG

You can see from the output of the "Upload to S3" step below that $BITBUCKET_REPO_SLUG.zip is correctly expanded to <repository_name>.zip, although the upload then fails because the file does not exist:

INFO: Authenticating with a OpenID Connect (OIDC) Web Identity Provider
INFO: Executing the aws-ecr-push-image pipe...
INFO: Uploading powershell_trade_capture.zip to S3.
Traceback (most recent call last):
  File "/pipe.py", line 264, in <module>
    pipe.run()
  File "/pipe.py", line 254, in run
    self.upload_to_s3()
  File "/pipe.py", line 230, in upload_to_s3
    with open(self.get_variable('ZIP_FILE'), 'rb') as zip_file:
FileNotFoundError: [Errno 2] No such file or directory: 'powershell_trade_capture.zip'

Why is the pipeline variable $BITBUCKET_REPO_SLUG correctly expanded to the repository name in the "Upload to S3" step, but not expanded (and instead treated as a literal string) in the "Zip Code" step?



Solution 1:[1]

The problem is that variable substitution is not currently supported in the artifacts section of Bitbucket Pipelines. See https://jira.atlassian.com/browse/BCLOUD-21666 (created in February 2022) for details.

The workaround we found was to avoid the artifacts section altogether by combining the "Zip Code" and "Upload to S3" steps of the pipeline into a single step, which meant it was no longer necessary to pass the ZIP file between steps as an artifact.

This new combined step was set up as follows:

      - step:
          name: Zip & Upload to S3
          services:
            - docker
          oidc: true
          image: atlassian/default-image:3
          script:
            # Build the zip file
            - zip -r $BITBUCKET_REPO_SLUG.zip "applications"
            # Upload the zip file to S3
            - pipe: atlassian/aws-code-deploy:1.1.1
              variables:
                AWS_DEFAULT_REGION: $AWS_REGION
                AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
                COMMAND: "upload"
                APPLICATION_NAME: $APPLICATION_NAME
                ZIP_FILE: $BITBUCKET_REPO_SLUG.zip
                S3_BUCKET: $S3_BUCKET_STAGING
                VERSION_LABEL: $BITBUCKET_REPO_SLUG

In this case there was no problem with the use of $BITBUCKET_REPO_SLUG and the pipeline was successful.
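
An alternative sketch, not tested as part of this answer, keeps the two steps separate and simply avoids the variable in the artifacts section: artifact patterns accept glob patterns, so the ZIP file can be matched with *.zip instead of its expanded name (this assumes the build produces no other *.zip files in the clone directory).

    staging:
      - step:
          name: Zip Code
          image: atlassian/default-image:3
          script:
            # $BITBUCKET_REPO_SLUG expands normally inside script commands
            - zip -r "$BITBUCKET_REPO_SLUG.zip" "applications"
          artifacts:
            # Glob pattern instead of a variable; assumes this is the only
            # *.zip produced by the build
            - "*.zip"

The "Upload to S3" step can then keep ZIP_FILE: "$BITBUCKET_REPO_SLUG.zip" as before, since the pipe output in the question shows that variable expansion does work for pipe variables.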

Solution 2:[2]

You don't need quotes for the zip command or the artifacts entry, and I would assume the variable is not being expanded correctly because of that.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1: gerard
[2] Solution 2: Randommm