
Repo VM will not provision: error about not enough space for pip to install Ansible. #14

Open
eponafyrefly opened this issue Mar 25, 2021 · 1 comment · Fixed by #21 or #16

Comments

@eponafyrefly

Environments: RHEL 8.4 and Fedora 33.
Fedora 33 is running the following packages:

vagrant-libvirt-0.1.2-1.fc33.noarch
vagrant-2.2.9-3.fc33.noarch
virtualbox-guest-additions-6.1.18-1.fc33.x86_64
VirtualBox-kmodsrc-6.1.18-1.fc33.noarch
akmod-VirtualBox-6.1.18-1.fc33.x86_64
VirtualBox-server-6.1.18-1.fc33.x86_64
VirtualBox-6.1.18-1.fc33.x86_64
kmod-VirtualBox-5.11.8-200.fc33.x86_64-6.1.18-1.fc33.x86_64

Running 'vagrant up' results in:

==> repo: Running provisioner: shell...
    repo: Running: inline script
    repo: Updating Subscription Management repositories.
    repo: Unable to read consumer identity
    repo: 
    repo: This system is not registered to Red Hat Subscription Management. You can use subscription-manager to register.
    repo: Last metadata expiration check: 0:47:29 ago on Wed 24 Mar 2021 09:43:18 PM PDT.
    repo: epel-release-latest-7.noarch.rpm                 64 kB/s |  15 kB     00:00    
    repo: Package epel-release-7-13.noarch is already installed.
    repo: Dependencies resolved.
    repo: Nothing to do.
    repo: Complete!
    repo: Updating Subscription Management repositories.
    repo: Unable to read consumer identity
    repo: 
    repo: This system is not registered to Red Hat Subscription Management. You can use subscription-manager to register.
    repo: Last metadata expiration check: 0:47:30 ago on Wed 24 Mar 2021 09:43:18 PM PDT.
    repo: Package sshpass-1.06-1.el7.x86_64 is already installed.
    repo: Package python3-pip-9.0.3-13.el8.noarch is already installed.
    repo: Package python36-devel-3.6.8-1.module+el8+2710+846623d6.x86_64 is already installed.
    repo: Package httpd-2.4.37-30.module+el8.3.0+7001+0766b9e7.x86_64 is already installed.
    repo: Package vsftpd-3.0.3-28.el8.x86_64 is already installed.
    repo: Package createrepo_c-0.11.0-1.el8.x86_64 is already installed.
    repo: Dependencies resolved.
    repo: Nothing to do.
    repo: Complete!
==> repo: Running provisioner: shell...
    repo: Running: inline script
    repo: Requirement already satisfied: pip in /usr/local/lib/python3.6/site-packages (21.0.1)
    repo: Requirement already satisfied: pexpect in /usr/local/lib/python3.6/site-packages (4.8.0)
    repo: Requirement already satisfied: ptyprocess>=0.5 in /usr/local/lib/python3.6/site-packages (from pexpect) (0.7.0)
    repo: Collecting ansible
    repo:   Downloading ansible-3.1.0.tar.gz (31.1 MB)
    repo: ERROR: Could not install packages due to an OSError: [Errno 28] No space left on device: '/tmp/pip-install-y4vxk1eb/ansible_ac2c1320a7644857a445e84eeefeb766/ansible_collections/arista/eos/plugins/module_utils/network/eos/rm_templates/ospfv3.py'
The SSH command responded with a non-zero exit status. Vagrant
assumes that this means the command failed. The output for this command
should be in the log above. Please read the output to determine what
went wrong.

I have plenty of disk space available and have set my TMPDIR variable to /home:

[root@localhost rhce8env]# df -h
Filesystem      Size  Used Avail Use% Mounted on
devtmpfs         16G     0   16G   0% /dev
tmpfs            16G     0   16G   0% /dev/shm
tmpfs           6.3G  2.1M  6.3G   1% /run
/dev/nvme0n1p3  465G  168G  297G  37% /
/dev/nvme0n1p3  465G  168G  297G  37% /home
/dev/nvme0n1p2  976M  245M  665M  27% /boot
/dev/nvme0n1p1  599M   21M  579M   4% /boot/efi
tmpfs            16G   64K   16G   1% /tmp
tmpfs           3.2G  200K  3.2G   1% /run/user/1000
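
The df output above is from the host; the failing pip step runs inside the guest, so an in-guest check would show whether the VM itself is out of space. A minimal sketch (assuming the machine name repo, as shown in the provisioning output, and that the guest is booted):

    vagrant ssh repo -c 'df -h /tmp /'

If the guest's /tmp or root filesystem is nearly full there, the host's free space and TMPDIR setting would not reach pip, since the ansible tarball is unpacked inside the VM.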

Please let me know if there's any other info I can provide. Thanks!

@eponafyrefly

Author

This is an issue with the size of the repo VM provisioned. @glengib has described a workaround here
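
(The linked workaround is not reproduced here. As a generic sketch only, assuming some filesystem inside the guest has more free space than the one pip is unpacking into, and using /var/tmp/pip-build purely as an example path, the install could be retried with pip's temporary directory pointed elsewhere:

    vagrant ssh repo
    sudo mkdir -p /var/tmp/pip-build
    sudo TMPDIR=/var/tmp/pip-build pip3 install --no-cache-dir ansible

pip honors TMPDIR for its build/unpack area, and --no-cache-dir avoids also writing the ~31 MB sdist into the cache. Whether this helps depends on where the repo VM actually has free space, which is why resizing the VM itself is the more direct fix.)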
