pull

Synopsis

ramalama pull [options] model

Description

Pull the specified AI Model into local storage.
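As a sketch of typical usage, models can be pulled from different transports by prefixing the model name. The transport prefixes follow ramalama(1); the specific model names and repository paths below are illustrative assumptions, not guaranteed to exist:

```shell
# Pull a model via the Ollama registry transport (model name is an example)
ramalama pull ollama://tinyllama

# Pull from Hugging Face (hypothetical repository path)
ramalama pull huggingface://example-org/example-model-GGUF

# Pull an OCI artifact from a container registry (hypothetical image reference)
ramalama pull oci://quay.io/example/mymodel:latest
```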

Options

--authfile=path

Path of the authentication file for OCI registries

--help, -h

Print usage message

--tls-verify=true

Require HTTPS and verify certificates when contacting OCI registries

--verify=true

Verify the model after pulling. Disable to allow pulling models whose endianness differs from the host
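The options above might be combined as follows; the registry hosts, paths, and model references are hypothetical placeholders:

```shell
# Skip TLS verification for a registry with a self-signed certificate
# (hypothetical internal registry host)
ramalama pull --tls-verify=false oci://registry.internal.example.com/models/example:latest

# Authenticate against a private registry with a dedicated authfile
# (path shown is the common containers auth.json location, used here as an example)
ramalama pull --authfile ~/.config/containers/auth.json oci://quay.io/example/private-model
```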

PROXY SUPPORT

RamaLama supports HTTP, HTTPS, and SOCKS proxies via standard environment variables:

  • HTTP_PROXY or http_proxy: Proxy for HTTP connections
  • HTTPS_PROXY or https_proxy: Proxy for HTTPS connections
  • NO_PROXY or no_proxy: Comma-separated list of hosts to bypass proxy

Example proxy URL formats:

  • HTTP/HTTPS: http://proxy.example.com:8080 or https://proxy.example.com:8443
  • SOCKS4: socks4://proxy.example.com:1080
  • SOCKS5: socks5://proxy.example.com:1080 or socks5h://proxy.example.com:1080 (DNS through proxy)

SOCKS proxy support requires the PySocks library (pip install PySocks).
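A minimal sketch of configuring a proxy before pulling, assuming a hypothetical proxy host and internal domain:

```shell
# Route registry traffic through an HTTP proxy (hypothetical host)
export HTTPS_PROXY=http://proxy.example.com:8080

# Bypass the proxy for local and internal hosts (hypothetical internal domain)
export NO_PROXY=localhost,127.0.0.1,.internal.example.com

# Subsequent pulls in this shell inherit the proxy settings, e.g.:
#   ramalama pull <model>
echo "$HTTPS_PROXY"
```

Because these are ordinary environment variables, they affect only the current shell session unless exported from a profile or service unit.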

See Also

ramalama(1)


Aug 2024, Originally compiled by Dan Walsh <dwalsh@redhat.com>