  1. Oct 25, 2024
  2. Oct 22, 2024
  3. Oct 16, 2024
  4. Oct 15, 2024
  5. Oct 10, 2024
  6. Sep 30, 2024
  7. Sep 18, 2024
  8. Sep 16, 2024
      [release-2.25] Refactor and expand download_hash.py (#11539) · 54a7ec56
      k8s-infra-cherrypick-robot authored
      
      
      * download_hash.py: generalized and data-driven
      
      The script is currently limited to one hardcoded URL for Kubernetes-related
      binaries and a fixed set of architectures.
      
      The solution is three-fold (a sketch follows the list):
      1. Use a URL template dictionary for each download -> this allows easily
         adding support for new downloads.
      2. Source the architectures to search for from the existing data.
      3. Enumerate the existing versions in the data and start searching from
         the last one until no newer version is found (newer in the version
         ordering sense, irrespective of actual age).
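
      A minimal sketch of this data-driven approach, assuming a hypothetical
      data layout (component -> arch -> version -> sha256) and illustrative URL
      templates; it is not the actual download_hash.py:

      ```python
      # Sketch only -- the URL templates, data layout and helpers are assumptions.
      import requests

      URL_TEMPLATES = {  # 1. one URL template per supported download
          "kubectl": "https://dl.k8s.io/release/v{version}/bin/linux/{arch}/kubectl.sha256",
          "crictl": ("https://github.com/kubernetes-sigs/cri-tools/releases/download/"
                     "v{version}/crictl-v{version}-linux-{arch}.tar.gz.sha256"),
      }

      def next_patch(version: str) -> str:
          """Return the next patch release, e.g. '1.30.2' -> '1.30.3'."""
          major, minor, patch = version.split(".")
          return f"{major}.{minor}.{int(patch) + 1}"

      def update_component(checksums: dict, component: str) -> None:
          """checksums[component] is assumed to map arch -> {version: sha256}."""
          for arch, versions in checksums[component].items():
              # 2. the architectures to query come from the existing data
              latest = max(versions, key=lambda v: tuple(map(int, v.split("."))))
              while True:
                  # 3. probe versions newer than the newest already recorded one
                  candidate = next_patch(latest)
                  url = URL_TEMPLATES[component].format(version=candidate, arch=arch)
                  resp = requests.get(url)
                  if resp.status_code != 200:
                      break  # no newer version published upstream
                  versions[candidate] = resp.text.split()[0]
                  latest = candidate
      ```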
      
      * download_hash.py: support for 'multi-hash' file + runc
      
      runc upstream does not provide one hash file per asset in its
      releases, but a single file with all the hashes.
      To handle this (and/or any arbitrary format from upstreams), add a
      dictionary mapping the name of the download to a lambda function which
      transforms the file provided by upstream into a dictionary of hashes,
      keyed by architecture (see the sketch below).
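
      A sketch of the idea with made-up names, assuming the single runc checksum
      file follows the usual sha256sum format ('<sha256>  runc.<arch>' per line):

      ```python
      # Sketch only -- the mapping name and parsing details are assumptions.
      # Each entry turns the raw upstream checksum file into {arch: sha256}.
      HASH_FILE_PARSERS = {
          "runc": lambda text: {
              line.split()[1].removeprefix("runc."): line.split()[0]  # Python 3.9+
              for line in text.splitlines()
              if len(line.split()) == 2 and line.split()[1].startswith("runc.")
          },
      }

      raw = "aaa111  runc.amd64\nbbb222  runc.arm64\n"
      print(HASH_FILE_PARSERS["runc"](raw))  # {'amd64': 'aaa111', 'arm64': 'bbb222'}
      ```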
      
      * download_hash: argument handling with argparse
      
      Allow the script to be called with a list of components, to only
      download checksums of new versions for those.
      By default, checksums of new versions are fetched for all components
      supported by the script (sketch below).
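
      A minimal illustration with argparse; the option layout and component
      list are assumptions, not the script's actual CLI:

      ```python
      # Sketch only -- shows the argparse pattern, not the real interface.
      import argparse

      SUPPORTED_COMPONENTS = ["kubectl", "kubelet", "kubeadm", "crictl", "runc"]

      parser = argparse.ArgumentParser(
          description="Fetch checksums for new versions of supported downloads")
      parser.add_argument(
          "components",
          nargs="*",
          default=SUPPORTED_COMPONENTS,  # no argument -> all supported components
          help="restrict the update to these components",
      )
      args = parser.parse_args()

      unknown = set(args.components) - set(SUPPORTED_COMPONENTS)
      if unknown:
          parser.error(f"unsupported components: {', '.join(sorted(unknown))}")

      for component in args.components:
          print(f"updating checksums for {component}")
      ```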
      
      * download_hash: propagate new patch versions to all archs
      
      * download_hash: add support for 'simple hash' components
      
      * download_hash: support 'multi-hash' components
      
      * download_hash: document missing support
      
      * download_hash: use persistent session
      
      This allows reusing the HTTP connection and is more efficient.
      Rough measurements show it saves around 25-30% of execution time.
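
      A minimal illustration of the persistent-session idea with requests (the
      function name is only for illustration):

      ```python
      # Sketch only -- shows a persistent HTTP session, not the script's actual code.
      import requests

      session = requests.Session()  # keeps the TCP/TLS connection alive across requests

      def fetch_hash(url: str) -> str:
          resp = session.get(url)   # reuses a pooled connection instead of reconnecting
          resp.raise_for_status()
          return resp.text.split()[0]
      ```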
      
      * download_hash: cache request for 'multi-hash' files
      
      This avoids re-downloading and re-parsing the same file for each
      architecture.
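
      One way to get this behaviour, sketched with functools (names are
      illustrative, not the script's actual code):

      ```python
      # Sketch only -- caching idea; the function name is an assumption.
      from functools import cache  # Python 3.9+; use lru_cache(maxsize=None) earlier
      import requests

      session = requests.Session()

      @cache
      def fetch_multi_hash_file(url: str) -> str:
          """Download a multi-hash file once; later per-arch calls hit the cache."""
          resp = session.get(url)
          resp.raise_for_status()
          return resp.text
      ```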
      
      * download_hash: document usage
      
      ---------
      
      Co-authored-by: Max Gautier <mg@max.gautier.name>
  9. Sep 05, 2024
  10. Aug 29, 2024
  11. Aug 20, 2024
  12. Aug 19, 2024
  13. Aug 15, 2024
  14. Aug 12, 2024
  15. Aug 09, 2024
  16. Aug 08, 2024
  17. Aug 06, 2024
  18. Jul 26, 2024
  19. Jul 19, 2024
  20. Jul 15, 2024
  21. Jul 12, 2024
      [release-2.25] pre-commit: make hooks self contained + ci config (#11359) · 9b122fb5
      k8s-infra-cherrypick-robot authored
      * Use an alternate, self-sufficient shellcheck pre-commit hook
      
      This pre-commit hook does not require prerequisites on the host, making it
      easier to run in CI workflows.
      
      * Switch to upstream ansible-lint pre-commit hook
      
      This way, the hook is self-contained and does not depend on a pre-existing
      virtualenv installation.
      
      * pre-commit: fix hooks dependencies
      
      - ansible-syntax-check
      - tox-inventory-builder
      - jinja-syntax-check
      
      * Fix ci-matrix pre-commit hook
      
      - Remove the dependency on pydblite, which fails to set up on recent Python versions
      - Discard the shell script and put everything into the pre-commit hook
      
      * pre-commit: apply autofixes hooks and fix the rest manually
      
      - markdownlint (manual fix)
      - end-of-file-fixer
      - requirements-txt-fixer
      - trailing-whitespace
      
      * Convert check_typo to pre-commit + use maintained version
      
      client9/misspell is unmaintained and has been forked by the golangci
      team; see https://github.com/client9/misspell/issues/197#issuecomment-1596318684.
      
      They haven't yet added a pre-commit config, so use my fork with the
      pre-commit hook config until the pull request is merged.
      
      * collection-build-install convert to pre-commit
      
      * Run pre-commit hooks in dynamic pipeline
      
      Use the GitLab dynamic child pipelines feature to have a single source of
      truth for the pre-commit jobs: the pre-commit config file (a sketch of the
      idea follows below).
      
      Use one cache per pre-commit hook. This should reduce the time spent in the
      "fetching cache" step in gitlab-ci, since each job will have a separate
      cache with only its own hook installed.
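
      A hypothetical sketch of the generation step (not the actual Kubespray
      implementation; file names, image and job layout are assumptions): read
      .pre-commit-config.yaml and emit one child-pipeline job per hook, each
      with its own cache:

      ```python
      # Hypothetical child-pipeline generator -- assumptions throughout.
      import yaml  # PyYAML

      with open(".pre-commit-config.yaml") as f:
          config = yaml.safe_load(f)

      pipeline = {}
      for repo in config["repos"]:
          for hook in repo.get("hooks", []):
              hook_id = hook["id"]
              pipeline[f"pre-commit-{hook_id}"] = {
                  "image": "python:3.11",
                  "script": [
                      "pip install pre-commit",
                      f"pre-commit run --all-files {hook_id}",
                  ],
                  # one cache per hook: each job restores only its own environment
                  "cache": {"key": f"pre-commit-{hook_id}", "paths": [".cache/pre-commit"]},
                  "variables": {"PRE_COMMIT_HOME": ".cache/pre-commit"},
              }

      with open("child-pipeline.yml", "w") as f:
          yaml.safe_dump(pipeline, f)
      ```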
      
      * Remove gitlab-ci job done in pre-commit
      
      * pre-commit: adjust markdownlint defaults, md fixes
      
      Use a style file as recommended by upstream. This makes for a single
      source of truth.
      Keep the previous upstream default for MD007 (the upstream default changed
      in https://github.com/markdownlint/markdownlint/pull/373).
      
      * Update pre-commit hooks
      
      ---------
      
      Co-authored-by: Max Gautier <mg@max.gautier.name>
  22. Jul 11, 2024
  23. Jul 01, 2024
  24. Jun 27, 2024
  25. Jun 10, 2024
  26. May 31, 2024
  27. May 30, 2024
  28. May 21, 2024
  29. May 20, 2024