
[1.0.2b2] Cannot upload packages to CDP (S3)

See original GitHub issue

Tried out the latest 1.0.1 beta but I’m still running into constant errors. I’ve included some output from autopkg runs which I’m hoping will help track the issue down.

Before running this I did the following:

  • Deleted the Policy from Jamf
  • Removed the latest Atom pkg from S3
  • Removed the latest Atom pkg from my File Share Distribution Point

The following is a breakdown of what occurred after each run:

Run 1:

  • Atom is downloaded and packaged
  • Atom pkg record is created in the Jamf server

Run 2:

  • Atom policy is created on the Jamf server
  • Atom pkg is copied to the local file share
  • Atom pkg is uploaded to S3, but the file is broken (only 70 MB vs 132 MB)

Run 3 (before running, I deleted the pkg record from Jamf, which also removes the broken file from S3 and from the Policy):

  • Package record is created in Jamf
  • No file is uploaded to S3
  • No package is attached to the Atom Policy
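A truncated upload like the one in Run 2 can be caught by comparing the local pkg size with the size S3 reports for the object. This is a minimal sketch, assuming boto3 is installed and credentials are configured; the bucket and key names are hypothetical:

```python
import os

def is_truncated(local_size: int, remote_size: int) -> bool:
    """A partial upload shows up as a remote object smaller than the local file."""
    return remote_size < local_size

def check_pkg_upload(local_path: str, bucket: str, key: str) -> bool:
    """Compare the on-disk pkg against the uploaded S3 object (network call)."""
    import boto3  # third-party; assumed installed and configured
    s3 = boto3.client("s3")
    head = s3.head_object(Bucket=bucket, Key=key)
    return not is_truncated(os.path.getsize(local_path), head["ContentLength"])
```

For the sizes reported above (70 MB on S3 vs 132 MB locally), `is_truncated` would flag the object as broken.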

JSSImporter error log.txt

Issue Analytics

  • State:closed
  • Created 6 years ago
  • Comments:17 (3 by maintainers)

Top GitHub Comments

2 reactions
mosen commented, Oct 21, 2018

I believe that somewhere between Jamf Pro 10.5.0 and 10.7.1 the implementation of /dbfileupload was changed. This is the URL that JSSImporter has always used to upload packages for CDP, JCDS and JDS.
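For context, the legacy upload roughly amounts to a POST against `/dbfileupload` with metadata passed as HTTP headers. The header names and values below are an assumption taken from community tooling around JSSImporter, not an official Jamf API contract:

```python
def dbfileupload_headers(pkg_name: str) -> dict:
    """Headers for the legacy /dbfileupload endpoint (community-documented,
    assumed values -- not an official Jamf API)."""
    return {
        "DESTINATION": "0",  # distribution-point upload (assumed value)
        "OBJECT_ID": "-1",   # -1 asks the server to create a new package record
        "FILE_TYPE": "0",
        "FILE_NAME": pkg_name,
    }

def upload_via_dbfileupload(jss_url: str, user: str, password: str, pkg_path: str):
    """Stream the pkg to the server in one request (network call)."""
    import requests  # third-party; assumed installed
    with open(pkg_path, "rb") as fh:
        resp = requests.post(
            f"{jss_url}/dbfileupload",
            auth=(user, password),
            headers=dbfileupload_headers(pkg_path.rsplit("/", 1)[-1]),
            data=fh,
        )
    resp.raise_for_status()
    return resp
```

A single long-running request like this is fragile for large packages, which is consistent with the truncated 70 MB object seen above.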

Seeing as this option seems to be deprecated/end-of-life and we don’t have an official API for it, I would like to offer direct upload options for AWS S3 CDPs and for JCDS (the latter already exists).
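A direct S3 upload could be sketched with boto3's managed transfer, which splits large files into multipart chunks and retries individual parts rather than streaming one fragile request. This is a sketch under those assumptions, not JSSImporter's actual implementation; bucket and key names are hypothetical:

```python
def multipart_part_count(size_bytes: int, part_size: int = 8 * 1024 * 1024) -> int:
    """How many parts a file splits into; 8 MiB mirrors boto3's default chunk size."""
    return -(-size_bytes // part_size)  # ceiling division

def upload_pkg_direct(local_path: str, bucket: str, key: str) -> None:
    """Upload straight to the S3 bucket, bypassing /dbfileupload (network call)."""
    import boto3  # third-party; assumed installed and configured
    s3 = boto3.client("s3")
    # upload_file uses managed multipart transfers for large objects,
    # avoiding the truncated single-stream uploads described in this issue.
    s3.upload_file(local_path, bucket, key)
```

A 132 MB pkg would go up as 17 parts at the default 8 MiB chunk size, each retried independently on failure.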

The old distribution point “CDP” will remain for people running older versions of JAMF Pro.

0 reactions
grahampugh commented, Nov 16, 2019

Closing this as it relates to the old version that used curl. If anyone is still seeing the issue, comment here and I’ll reopen.
