AzCopy "segmentation fault" error after copying file with bash script


I'm installing the AzCopy utility (to copy files from Azure Storage) on my local VM, but I can't run it due to a "Segmentation fault" error.

To install AzCopy I'm using a script:

#!/bin/bash

cd /tmp/

# remove any leftover azcopy downloads from previous runs
rm *azcopy* &> /dev/null

echo "Downloading azcopy..."
echo 
echo "-----------------------------------------------------------"
wget https://aka.ms/downloadazcopy-v10-linux && echo -n "Download complete, unpacking...  " || (echo "Error downloading the file - aborting"; exit 1)
checkstr=$(tar --exclude=*.txt -tf downloadazcopy-v10-linux | grep /azcopy)

if [[ $checkstr == *"/azcopy" ]];
then echo "azcopy file found - verification OK"
else (
  echo "ERROR - azcopy file missing - aborting"
  exit 1
)
fi

tar --strip-components=1 --exclude=*.txt -xzvf downloadazcopy-v10-linux & > /dev/null && echo "Unpacking OK" || (echo "Unpacking failed - aborting"; exit 1)

echo "Copying azcopy to /usr/local/bin ... "
sudo cp /tmp/azcopy /usr/local/bin/ && echo "Copying successful" || (echo "Error copying file - aborting"; exit 1)

echo "Making it executable ..."
chmod +x /usr/local/bin/azcopy && echo "Azcopy installed OK !" || (echo "Can't make the binary executable - aborting"; exit 1)

# cleaning up :)
rm -rf *azcopy* & > /dev/null

The installation seemingly succeeds, but when I try to run the 'azcopy' command I always get "Segmentation fault", regardless of whether I run it with or without parameters - for example:

root@bbackup:/opt/githubbackup# azcopy
Segmentation fault
root@bbackup:/opt/githubbackup# azcopy login --identity
Segmentation fault
root@bbackup:/opt/githubbackup# azcopy asdfgh
Segmentation fault
root@bbackup:/opt/githubbackup# 

When (after the script has done its job) I go to '/tmp/' and run the file with './azcopy', I get normal output and the utility works as expected. But when I run it from '/usr/local/bin' I get the "Segmentation fault" error.

When I copy the file manually with 'cp /tmp/azcopy /usr/local/bin', the file also works as it's supposed to.

Everything points to the script itself.

P.S. - I have lots of free disk space available.

I tried using a different working directory ('/aztmp/') with no effect, and changing the file's owner & group with 'chown root:root azcopy' to no avail as well.

I could extract the file directly to '/usr/local/bin', but since this is a production system I want an extra layer of security.

My suspicion is that the file gets corrupted while being copied from '/tmp/' - but why?
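A quick way to test that theory (a hypothetical check, assuming the copy in '/tmp/' is still present after the script runs) would be to compare the two files:

sha256sum /tmp/azcopy /usr/local/bin/azcopy   # differing hashes would confirm the copy is corrupted
ls -l /tmp/azcopy /usr/local/bin/azcopy       # a smaller file in /usr/local/bin would point to a truncated copy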

1 Answer

Answer by Wojtek_B:

Solution found.

The issue was this part:

tar --strip-components=1 --exclude=*.txt -xzvf downloadazcopy-v10-linux & > /dev/null && echo "Unpacking OK" || (echo "Unpacking failed - aborting"; exit 1)

Note the '& >' part: instead of redirecting all output to /dev/null (which '&>' without the space does), the stray space made '&' send tar to the background, and '> /dev/null' became a separate, empty redirection. Because tar was still extracting in the background, the script went on to copy /tmp/azcopy to /usr/local/bin while the file was only partially written - hence the corrupted binary and the "Segmentation fault". (The '&& echo "Unpacking OK"' then applied to that empty redirection, which always succeeds, so the script still reported success.)
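A minimal illustration of the difference (not from the original script; 'sleep' just stands in for a slow command like tar):

( sleep 2; echo "done" ) & > /dev/null    # subshell is backgrounded; '> /dev/null' is a separate, empty redirection
echo "this runs immediately - the subshell is still working in the background"

( sleep 2; echo "done" ) &> /dev/null     # runs in the foreground, with all output silenced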

After removing the space, everything works :)
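For reference, the corrected line (the same command, just with '&>' instead of '& >'):

tar --strip-components=1 --exclude=*.txt -xzvf downloadazcopy-v10-linux &> /dev/null && echo "Unpacking OK" || (echo "Unpacking failed - aborting"; exit 1)

The same '& >' pattern also appears in the 'rm -rf *azcopy* & > /dev/null' cleanup line at the end of the script, so it presumably deserves the same fix.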

P.S. This is just something a colleague at work figured out, but since he doesn't have an SO account I'm posting it here.