comparison pipcl.py @ 39:a6bc019ac0b2 upstream

ADD: PyMuPDF v1.26.5: the original sdist.
author Franz Glasner <fzglas.hg@dom66.de>
date Sat, 11 Oct 2025 11:19:58 +0200
parents 1d09e1dec1d9
children 71bcc18e306f
comparison
2:b50eed0cc0ef 39:a6bc019ac0b2
1 ''' 1 '''
2 Python packaging operations, including PEP-517 support, for use by a `setup.py` 2 Python packaging operations, including PEP-517 support, for use by a `setup.py`
3 script. 3 script.
4 4
5 The intention is to take care of as many packaging details as possible so that 5 Overview:
6 setup.py contains only project-specific information, while also giving as much 6
7 flexibility as possible. 7 The intention is to take care of as many packaging details as possible so
8 8 that setup.py contains only project-specific information, while also giving
9 For example we provide a function `build_extension()` that can be used to build 9 as much flexibility as possible.
10 a SWIG extension, but we also give access to the located compiler/linker so 10
11 that a `setup.py` script can take over the details itself. 11 For example we provide a function `build_extension()` that can be used
12 12 to build a SWIG extension, but we also give access to the located
13 Run doctests with: `python -m doctest pipcl.py` 13 compiler/linker so that a `setup.py` script can take over the details
14 14 itself.
15 For Graal we require that PIPCL_GRAAL_PYTHON is set to non-graal Python (we 15
16 build for non-graal except with Graal Python's include paths and library 16 Doctests:
17 directory). 17 Doctest strings are provided in some comments.
18
19 Test in the usual way with:
20 python -m doctest pipcl.py
21
22 Test specific functions/classes with:
23 python pipcl.py --doctest run_if ...
24
25 If no functions or classes are specified, this tests everything.
26
27 Graal:
28 For Graal we require that PIPCL_GRAAL_PYTHON is set to non-graal Python (we
29 build for non-graal except with Graal Python's include paths and library
30 directory).
18 ''' 31 '''
19 32
20 import base64 33 import base64
21 import codecs 34 import codecs
35 import difflib
22 import glob 36 import glob
23 import hashlib 37 import hashlib
24 import inspect 38 import inspect
25 import io 39 import io
26 import os 40 import os
52 66
53 We also support basic command line handling for use 67 We also support basic command line handling for use
54 with a legacy (pre-PEP-517) pip, as implemented 68 with a legacy (pre-PEP-517) pip, as implemented
55 by legacy distutils/setuptools and described in: 69 by legacy distutils/setuptools and described in:
56 https://pip.pypa.io/en/stable/reference/build-system/setup-py/ 70 https://pip.pypa.io/en/stable/reference/build-system/setup-py/
71
72 The file pyproject.toml must exist; this is checked if/when fn_build() is
73 called.
57 74
58 Here is a `doctest` example of using pipcl to create a SWIG extension 75 Here is a `doctest` example of using pipcl to create a SWIG extension
59 module. Requires `swig`. 76 module. Requires `swig`.
60 77
61 Create an empty test directory: 78 Create an empty test directory:
319 336
320 wheel_compression = zipfile.ZIP_DEFLATED, 337 wheel_compression = zipfile.ZIP_DEFLATED,
321 wheel_compresslevel = None, 338 wheel_compresslevel = None,
322 ): 339 ):
323 ''' 340 '''
324 The initial args before `root` define the package 341 The initial args before `entry_points` define the
325 metadata and closely follow the definitions in: 342 package metadata and closely follow the definitions in:
326 https://packaging.python.org/specifications/core-metadata/ 343 https://packaging.python.org/specifications/core-metadata/
327 344
328 Args: 345 Args:
329 346
330 name: 347 name:
348 Used for metadata `Name`.
331 A string, the name of the Python package. 349 A string, the name of the Python package.
332 version: 350 version:
351 Used for metadata `Version`.
333 A string, the version of the Python package. Also see PEP-440 352 A string, the version of the Python package. Also see PEP-440
334 `Version Identification and Dependency Specification`. 353 `Version Identification and Dependency Specification`.
335 platform: 354 platform:
355 Used for metadata `Platform`.
336 A string or list of strings. 356 A string or list of strings.
337 supported_platform: 357 supported_platform:
358 Used for metadata `Supported-Platform`.
338 A string or list of strings. 359 A string or list of strings.
339 summary: 360 summary:
361 Used for metadata `Summary`.
340 A string, short description of the package. 362 A string, short description of the package.
341 description: 363 description:
364 Used for metadata `Description`.
342 A string. If contains newlines, a detailed description of the 365 A string. If contains newlines, a detailed description of the
343 package. Otherwise the path of a file containing the detailed 366 package. Otherwise the path of a file containing the detailed
344 description of the package. 367 description of the package.
345 description_content_type: 368 description_content_type:
369 Used for metadata `Description-Content-Type`.
346 A string describing markup of `description` arg. For example 370 A string describing markup of `description` arg. For example
347 `text/markdown; variant=GFM`. 371 `text/markdown; variant=GFM`.
348 keywords: 372 keywords:
373 Used for metadata `Keywords`.
349 A string containing comma-separated keywords. 374 A string containing comma-separated keywords.
350 home_page: 375 home_page:
376 Used for metadata `Home-page`.
351 URL of home page. 377 URL of home page.
352 download_url: 378 download_url:
379 Used for metadata `Download-URL`.
353 Where this version can be downloaded from. 380 Where this version can be downloaded from.
354 author: 381 author:
382 Used for metadata `Author`.
355 Author. 383 Author.
356 author_email: 384 author_email:
385 Used for metadata `Author-email`.
357 Author email. 386 Author email.
358 maintainer: 387 maintainer:
388 Used for metadata `Maintainer`.
359 Maintainer. 389 Maintainer.
360 maintainer_email: 390 maintainer_email:
391 Used for metadata `Maintainer-email`.
361 Maintainer email. 392 Maintainer email.
362 license: 393 license:
394 Used for metadata `License`.
363 A string containing the license text. Written into metadata 395 A string containing the license text. Written into metadata
364 file `COPYING`. Is also written into metadata itself if not 396 file `COPYING`. Is also written into metadata itself if not
365 multi-line. 397 multi-line.
366 classifier: 398 classifier:
399 Used for metadata `Classifier`.
367 A string or list of strings. Also see: 400 A string or list of strings. Also see:
368 401
369 * https://pypi.org/pypi?%3Aaction=list_classifiers 402 * https://pypi.org/pypi?%3Aaction=list_classifiers
370 * https://pypi.org/classifiers/ 403 * https://pypi.org/classifiers/
371 404
372 requires_dist: 405 requires_dist:
373 A string or list of strings. None items are ignored. Also see PEP-508. 406 Used for metadata `Requires-Dist`.
407 A string or list of strings, Python packages required
408 at runtime. None items are ignored.
374 requires_python: 409 requires_python:
410 Used for metadata `Requires-Python`.
375 A string or list of strings. 411 A string or list of strings.
376 requires_external: 412 requires_external:
413 Used for metadata `Requires-External`.
377 A string or list of strings. 414 A string or list of strings.
378 project_url: 415 project_url:
379 A string or list of strings, each of the form: `{name}, {url}`. 416 Used for metadata `Project-URL`.
417 A string or list of strings, each of the form: `{name},
418 {url}`.
380 provides_extra: 419 provides_extra:
420 Used for metadata `Provides-Extra`.
381 A string or list of strings. 421 A string or list of strings.
382 422
383 entry_points: 423 entry_points:
384 String or dict specifying *.dist-info/entry_points.txt, for 424 String or dict specifying *.dist-info/entry_points.txt, for
385 example: 425 example:
413 be the path to a file; a relative path is treated as relative 453 be the path to a file; a relative path is treated as relative
414 to `root`. If a `bytes` it is the contents of the file to be 454 to `root`. If a `bytes` it is the contents of the file to be
415 added. 455 added.
416 456
417 `to_` identifies what the file should be called within a wheel 457 `to_` identifies what the file should be called within a wheel
418 or when installing. If `to_` ends with `/`, the leaf of `from_` 458 or when installing. If `to_` is empty or `/` we set it to the
419 is appended to it (and `from_` must not be a `bytes`). 459 leaf of `from_` (`from_` must not be a `bytes`) - i.e. we place
460 the file in the root directory of the wheel; otherwise if
461 `to_` ends with `/` the leaf of `from_` is appended to it (and
462 `from_` must not be a `bytes`).
420 463
421 Initial `$dist-info/` in `_to` is replaced by 464 Initial `$dist-info/` in `_to` is replaced by
422 `{name}-{version}.dist-info/`; this is useful for license files 465 `{name}-{version}.dist-info/`; this is useful for license files
423 etc. 466 etc.
424 467
437 we copy `from_` to `{sitepackages}/{to_}`, where 480 we copy `from_` to `{sitepackages}/{to_}`, where
438 `sitepackages` is the installation directory, the 481 `sitepackages` is the installation directory, the
439 default being `sysconfig.get_path('platlib')` e.g. 482 default being `sysconfig.get_path('platlib')` e.g.
440 `myvenv/lib/python3.9/site-packages/`. 483 `myvenv/lib/python3.9/site-packages/`.
441 484
485 When calling this function, we assert that the file
486 pyproject.toml exists in the current directory. (We do this
487 here rather than in pipcl.Package's constructor, as otherwise
488 importing setup.py from non-package-related code could fail.)
489
442 fn_clean: 490 fn_clean:
443 A function taking a single arg `all_` that cleans generated 491 A function taking a single arg `all_` that cleans generated
444 files. `all_` is true iff `--all` is in argv. 492 files. `all_` is true iff `--all` is in argv.
445 493
446 For safety and convenience, can also return a list of 494
455 as returned by `fn_build`. 503 as returned by `fn_build`.
456 504
457 It can be convenient to use `pipcl.git_items()`. 505 It can be convenient to use `pipcl.git_items()`.
458 506
459 The specification for sdists requires that the list contains 507 The specification for sdists requires that the list contains
460 `pyproject.toml`; we enforce this with a diagnostic rather than 508 `pyproject.toml`; we enforce this with a Python assert.
461 raising an exception, to allow legacy command-line usage.
462 509
463 tag_python: 510 tag_python:
464 First element of wheel tag defined in PEP-425. If None we use 511 First element of wheel tag defined in PEP-425. If None we use
465 `cp{version}`. 512 `cp{version}`.
466 513
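As an illustration of the metadata and callback args described above, a minimal setup.py might look like the following sketch (the package name, file paths and metadata values are placeholders, and the module-level PEP-517 hook assignments are assumed wiring rather than something shown in this changeset):

    import sys
    import pipcl

    def build(config_settings=None):
        # Each item is a `from_` path or a `(from_, to_)` pair; a trailing `/`
        # in `to_` keeps the leaf name, and `$dist-info/` maps into the
        # `{name}-{version}.dist-info/` directory.
        return [
                'foo/__init__.py',
                ('README.md', '$dist-info/'),
                ]

    def sdist(config_settings=None):
        # The returned list must include pyproject.toml.
        return ['pyproject.toml', 'setup.py', 'foo/__init__.py', 'README.md']

    p = pipcl.Package(
            name = 'foo',
            version = '0.1.0',
            summary = 'Example package',
            author = 'A. N. Author',
            requires_python = '>=3.8',
            fn_build = build,
            fn_sdist = sdist,
            )

    # Assumed PEP-517 hook wiring; legacy `python setup.py ...` invocations
    # would go through p.handle_argv().
    build_wheel = p.build_wheel
    build_sdist = p.build_sdist

    if __name__ == '__main__':
        p.handle_argv(sys.argv)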
526 assert_str_or_multi( requires_dist) 573 assert_str_or_multi( requires_dist)
527 assert_str( requires_python) 574 assert_str( requires_python)
528 assert_str_or_multi( requires_external) 575 assert_str_or_multi( requires_external)
529 assert_str_or_multi( project_url) 576 assert_str_or_multi( project_url)
530 assert_str_or_multi( provides_extra) 577 assert_str_or_multi( provides_extra)
578
579 assert re.match('^([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])\\Z', name, re.IGNORECASE), (
580 f'Invalid package name'
581 f' (https://packaging.python.org/en/latest/specifications/name-normalization/)'
582 f': {name!r}'
583 )
531 584
532 # https://packaging.python.org/en/latest/specifications/core-metadata/. 585 # https://packaging.python.org/en/latest/specifications/core-metadata/.
533 assert re.match('([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])$', name, re.IGNORECASE), \ 586 assert re.match('([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])$', name, re.IGNORECASE), \
534 f'Bad name: {name!r}' 587 f'Bad name: {name!r}'
535 588
600 f' wheel_directory={wheel_directory!r}' 653 f' wheel_directory={wheel_directory!r}'
601 f' config_settings={config_settings!r}' 654 f' config_settings={config_settings!r}'
602 f' metadata_directory={metadata_directory!r}' 655 f' metadata_directory={metadata_directory!r}'
603 ) 656 )
604 657
605 if sys.implementation.name == 'graalpy': 658 if os.environ.get('CIBUILDWHEEL') == '1':
659 # Don't special-case graal builds when running under cibuildwheel.
660 pass
661 elif sys.implementation.name == 'graalpy':
606 # We build for Graal by building a native Python wheel with Graal 662 # We build for Graal by building a native Python wheel with Graal
607 # Python's include paths and library directory. We then rename the 663 # Python's include paths and library directory. We then rename the
608 # wheel to contain graal's tag etc. 664 # wheel to contain graal's tag etc.
609 # 665 #
610 log0(f'### Graal build: deferring to cpython.') 666 log0(f'### Graal build: deferring to cpython.')
752 if inspect.signature(self.fn_sdist).parameters: 808 if inspect.signature(self.fn_sdist).parameters:
753 items = self.fn_sdist(config_settings) 809 items = self.fn_sdist(config_settings)
754 else: 810 else:
755 items = self.fn_sdist() 811 items = self.fn_sdist()
756 812
757 prefix = f'{_normalise(self.name)}-{self.version}' 813 prefix = f'{_normalise2(self.name)}-{self.version}'
758 os.makedirs(sdist_directory, exist_ok=True) 814 os.makedirs(sdist_directory, exist_ok=True)
759 tarpath = f'{sdist_directory}/{prefix}.tar.gz' 815 tarpath = f'{sdist_directory}/{prefix}.tar.gz'
760 log2(f'Creating sdist: {tarpath}') 816 log2(f'Creating sdist: {tarpath}')
761 817
762 with tarfile.open(tarpath, 'w:gz') as tar: 818 with tarfile.open(tarpath, 'w:gz') as tar:
794 if from_.startswith(f'{os.path.abspath(sdist_directory)}/'): 850 if from_.startswith(f'{os.path.abspath(sdist_directory)}/'):
795 # Source files should not be inside <sdist_directory>. 851 # Source files should not be inside <sdist_directory>.
796 assert 0, f'Path is inside sdist_directory={sdist_directory}: {from_!r}' 852 assert 0, f'Path is inside sdist_directory={sdist_directory}: {from_!r}'
797 assert os.path.exists(from_), f'Path does not exist: {from_!r}' 853 assert os.path.exists(from_), f'Path does not exist: {from_!r}'
798 assert os.path.isfile(from_), f'Path is not a file: {from_!r}' 854 assert os.path.isfile(from_), f'Path is not a file: {from_!r}'
799 if to_rel == 'pyproject.toml':
800 found_pyproject_toml = True
801 add(from_, to_rel) 855 add(from_, to_rel)
802 856 if to_rel == 'pyproject.toml':
803 if not found_pyproject_toml: 857 found_pyproject_toml = True
804 log0(f'Warning: no pyproject.toml specified.') 858
859 assert found_pyproject_toml, f'Cannot create sdist because file not specified: pyproject.toml'
805 860
806 # Always add a PKG-INFO file. 861 # Always add a PKG-INFO file.
807 add_string(self._metainfo(), 'PKG-INFO') 862 add_string(self._metainfo(), 'PKG-INFO')
808 863
809 if self.license: 864 if self.license:
824 def tag_python(self): 879 def tag_python(self):
825 ''' 880 '''
826 Get two-digit python version tag, e.g. 'cp38' for python-3.8.6. 881
827 ''' 882 '''
828 if self.tag_python_: 883 if self.tag_python_:
829 return self.tag_python_ 884 ret = self.tag_python_
830 else: 885 else:
831 return 'cp' + ''.join(platform.python_version().split('.')[:2]) 886 ret = 'cp' + ''.join(platform.python_version().split('.')[:2])
887 assert '-' not in ret
888 return ret
832 889
833 def tag_abi(self): 890 def tag_abi(self):
834 ''' 891 '''
835 ABI tag. 892 ABI tag.
836 ''' 893 '''
882 ret2 = f'{m.group(1)}_0{m.group(2)}' 939 ret2 = f'{m.group(1)}_0{m.group(2)}'
883 log0(f'After macos patch, changing from {ret!r} to {ret2!r}.') 940 log0(f'After macos patch, changing from {ret!r} to {ret2!r}.')
884 ret = ret2 941 ret = ret2
885 942
886 log0( f'tag_platform(): returning {ret=}.') 943 log0( f'tag_platform(): returning {ret=}.')
944 assert '-' not in ret
887 return ret 945 return ret
888 946
889 def wheel_name(self): 947 def wheel_name(self):
890 return f'{_normalise(self.name)}-{self.version}-{self.tag_python()}-{self.tag_abi()}-{self.tag_platform()}.whl' 948 ret = f'{_normalise2(self.name)}-{self.version}-{self.tag_python()}-{self.tag_abi()}-{self.tag_platform()}.whl'
949 assert ret.count('-') == 4, f'Expected 4 dash characters in {ret=}.'
950 return ret
891 951
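For illustration, wheel_name() above joins the normalised name, version and the three tags with dashes (the concrete values below are hypothetical and depend on the running interpreter and platform):

    p = pipcl.Package(name='foo-bar', version='1.2.3')
    print(p.wheel_name())
    # On CPython 3.11 / x86_64 Linux this might print something like:
    #   foo_bar-1.2.3-cp311-cp311-linux_x86_64.whl
    # i.e. {normalised name}-{version}-{python tag}-{abi tag}-{platform tag}.whl,
    # which is why the assert above requires exactly 4 dash characters.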
892 def wheel_name_match(self, wheel): 952 def wheel_name_match(self, wheel):
893 ''' 953 '''
894 Returns true if `wheel` matches our wheel. We basically require the 954 Returns true if `wheel` matches our wheel. We basically require the
895 name to be the same, except that we accept platform tags that contain 955 name to be the same, except that we accept platform tags that contain
914 # This wheel uses Python stable ABI same or older than ours, so 974 # This wheel uses Python stable ABI same or older than ours, so
915 # we can use it. 975 # we can use it.
916 log2(f'py_limited_api; {tag_python=} compatible with {self.tag_python()=}.') 976 log2(f'py_limited_api; {tag_python=} compatible with {self.tag_python()=}.')
917 py_limited_api_compatible = True 977 py_limited_api_compatible = True
918 978
919 log2(f'{_normalise(self.name) == name=}') 979 log2(f'{_normalise2(self.name) == name=}')
920 log2(f'{self.version == version=}') 980 log2(f'{self.version == version=}')
921 log2(f'{self.tag_python() == tag_python=} {self.tag_python()=} {tag_python=}') 981 log2(f'{self.tag_python() == tag_python=} {self.tag_python()=} {tag_python=}')
922 log2(f'{py_limited_api_compatible=}') 982 log2(f'{py_limited_api_compatible=}')
923 log2(f'{self.tag_abi() == tag_abi=}') 983 log2(f'{self.tag_abi() == tag_abi=}')
924 log2(f'{self.tag_platform() in tag_platform.split(".")=}') 984 log2(f'{self.tag_platform() in tag_platform.split(".")=}')
925 log2(f'{self.tag_platform()=}') 985 log2(f'{self.tag_platform()=}')
926 log2(f'{tag_platform.split(".")=}') 986 log2(f'{tag_platform.split(".")=}')
927 ret = (1 987 ret = (1
928 and _normalise(self.name) == name 988 and _normalise2(self.name) == name
929 and self.version == version 989 and self.version == version
930 and (self.tag_python() == tag_python or py_limited_api_compatible) 990 and (self.tag_python() == tag_python or py_limited_api_compatible)
931 and self.tag_abi() == tag_abi 991 and self.tag_abi() == tag_abi
932 and self.tag_platform() in tag_platform.split('.') 992 and self.tag_platform() in tag_platform.split('.')
933 ) 993 )
945 ret += f'{value}\n' 1005 ret += f'{value}\n'
946 return ret 1006 return ret
947 1007
948 def _call_fn_build( self, config_settings=None): 1008 def _call_fn_build( self, config_settings=None):
949 assert self.fn_build 1009 assert self.fn_build
1010 assert os.path.isfile('pyproject.toml'), (
1011 'Cannot create package because file does not exist: pyproject.toml'
1012 )
950 log2(f'calling self.fn_build={self.fn_build}') 1013 log2(f'calling self.fn_build={self.fn_build}')
951 if inspect.signature(self.fn_build).parameters: 1014 if inspect.signature(self.fn_build).parameters:
952 ret = self.fn_build(config_settings) 1015 ret = self.fn_build(config_settings)
953 else: 1016 else:
954 ret = self.fn_build() 1017 ret = self.fn_build()
955 assert isinstance( ret, (list, tuple)), \ 1018 assert isinstance( ret, (list, tuple)), \
956 f'Expected list/tuple from {self.fn_build} but got: {ret!r}' 1019 f'Expected list/tuple from {self.fn_build} but got: {ret!r}'
1020
1021 # Check that any extensions that we have built have the same
1022 # py_limited_api value. If the package is marked with py_limited_api=True
1023 # then non-py_limited_api extensions seem to fail at runtime on
1024 # Windows.
1025 #
1026 # (We could possibly allow package py_limited_api=False and extensions
1027 # py_limited_api=True, but haven't tested this, and it seems simpler to
1028 # be strict.)
1029 for item in ret:
1030 from_, (to_abs, to_rel) = self._fromto(item)
1031 from_abs = os.path.abspath(from_)
1032 is_py_limited_api = _extensions_to_py_limited_api.get(from_abs)
1033 if is_py_limited_api is not None:
1034 assert bool(self.py_limited_api) == bool(is_py_limited_api), (
1035 f'Extension was built with'
1036 f' py_limited_api={is_py_limited_api} but pipcl.Package'
1037 f' name={self.name!r} has'
1038 f' py_limited_api={self.py_limited_api}:'
1039 f' {from_abs!r}'
1040 )
1041
957 return ret 1042 return ret
958 1043
959 1044
960 def _argv_clean(self, all_): 1045 def _argv_clean(self, all_):
961 ''' 1046 '''
1050 Called by `handle_argv()`. There doesn't seem to be any documentation 1135 Called by `handle_argv()`. There doesn't seem to be any documentation
1051 for `setup.py dist_info`, but it appears to be like `egg_info` except 1136 for `setup.py dist_info`, but it appears to be like `egg_info` except
1052 it writes to a slightly different directory. 1137 it writes to a slightly different directory.
1053 ''' 1138 '''
1054 if root is None: 1139 if root is None:
1055 root = f'{self.name}-{self.version}.dist-info' 1140 root = f'{_normalise2(self.name)}-{self.version}.dist-info'
1056 self._write_info(f'{root}/METADATA') 1141 self._write_info(f'{root}/METADATA')
1057 if self.license: 1142 if self.license:
1058 with open( f'{root}/COPYING', 'w') as f: 1143 with open( f'{root}/COPYING', 'w') as f:
1059 f.write( self.license) 1144 f.write( self.license)
1060 1145
1338 f' tag_platform={self.tag_platform_!r}' 1423 f' tag_platform={self.tag_platform_!r}'
1339 '}' 1424 '}'
1340 ) 1425 )
1341 1426
1342 def _dist_info_dir( self): 1427 def _dist_info_dir( self):
1343 return f'{_normalise(self.name)}-{self.version}.dist-info' 1428 return f'{_normalise2(self.name)}-{self.version}.dist-info'
1344 1429
1345 def _metainfo(self): 1430 def _metainfo(self):
1346 ''' 1431 '''
1347 Returns text for `.egg-info/PKG-INFO` file, or `PKG-INFO` in an sdist 1432 Returns text for `.egg-info/PKG-INFO` file, or `PKG-INFO` in an sdist
1348 `.tar.gz` file, or `...dist-info/METADATA` in a wheel. 1433 `.tar.gz` file, or `...dist-info/METADATA` in a wheel.
1444 1529
1445 If `p` is a string we convert to `(p, p)`. Otherwise we assert that 1530 If `p` is a string we convert to `(p, p)`. Otherwise we assert that
1446 `p` is a tuple `(from_, to_)` where `from_` is str/bytes and `to_` is 1531 `p` is a tuple `(from_, to_)` where `from_` is str/bytes and `to_` is
1447 str. If `from_` is a bytes it is contents of file to add, otherwise the 1532 str. If `from_` is a bytes it is contents of file to add, otherwise the
1448 path of an existing file; non-absolute paths are assumed to be relative 1533 path of an existing file; non-absolute paths are assumed to be relative
1449 to `self.root`. If `to_` is empty or ends with `/`, we append the leaf 1534 to `self.root`.
1450 of `from_` (which must be a str). 1535
1536 If `to_` is empty or `/` we set it to the leaf of `from_` (which must
1537 be a str) - i.e. we place the file in the root directory of the wheel;
1538 otherwise if `to_` ends with `/` we append the leaf of `from_` (which
1539 must be a str).
1451 1540
1452 If `to_` starts with `$dist-info/`, we replace this with 1541 If `to_` starts with `$dist-info/`, we replace this with
1453 `self._dist_info_dir()`. 1542 `self._dist_info_dir()`.
1454 1543
1455 If `to_` starts with `$data/`, we replace this with 1544 If `to_` starts with `$data/`, we replace this with
1465 assert isinstance(p, tuple) and len(p) == 2 1554 assert isinstance(p, tuple) and len(p) == 2
1466 1555
1467 from_, to_ = p 1556 from_, to_ = p
1468 assert isinstance(from_, (str, bytes)) 1557 assert isinstance(from_, (str, bytes))
1469 assert isinstance(to_, str) 1558 assert isinstance(to_, str)
1470 if to_.endswith('/') or to_=='': 1559 if to_ == '/' or to_ == '':
1560 to_ = os.path.basename(from_)
1561 elif to_.endswith('/'):
1471 to_ += os.path.basename(from_) 1562 to_ += os.path.basename(from_)
1472 prefix = '$dist-info/' 1563 prefix = '$dist-info/'
1473 if to_.startswith( prefix): 1564 if to_.startswith( prefix):
1474 to_ = f'{self._dist_info_dir()}/{to_[ len(prefix):]}' 1565 to_ = f'{self._dist_info_dir()}/{to_[ len(prefix):]}'
1475 prefix = '$data/' 1566 prefix = '$data/'
1476 if to_.startswith( prefix): 1567 if to_.startswith( prefix):
1477 to_ = f'{self.name}-{self.version}.data/{to_[ len(prefix):]}' 1568 to_ = f'{_normalise2(self.name)}-{self.version}.data/{to_[ len(prefix):]}'
1478 if isinstance(from_, str): 1569 if isinstance(from_, str):
1479 from_, _ = self._path_relative_to_root( from_, assert_within_root=False) 1570 from_, _ = self._path_relative_to_root( from_, assert_within_root=False)
1480 to_ = self._path_relative_to_root(to_) 1571 to_ = self._path_relative_to_root(to_)
1481 assert isinstance(from_, (str, bytes)) 1572 assert isinstance(from_, (str, bytes))
1482 log2(f'returning {from_=} {to_=}') 1573 log2(f'returning {from_=} {to_=}')
1483 return from_, to_ 1574 return from_, to_
1484 1575
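A few illustrative `(from_, to_)` mappings under the `_fromto()` rules above, assuming a hypothetical package with name='foo' and version='1.2.3' (the paths are made up):

    # (from_, to_)                        ->  path inside the wheel
    # ('build/foo.py', '')                ->  'foo.py'        (empty to_: leaf, wheel root)
    # ('build/_foo.so', 'foo/')           ->  'foo/_foo.so'   (trailing '/': leaf appended)
    # ('LICENSE', '$dist-info/COPYING')   ->  'foo-1.2.3.dist-info/COPYING'
    # ('cli.py', '$data/scripts/cli.py')  ->  'foo-1.2.3.data/scripts/cli.py'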
1576 _extensions_to_py_limited_api = dict()
1485 1577
1486 def build_extension( 1578 def build_extension(
1487 name, 1579 name,
1488 path_i, 1580 path_i,
1489 outdir, 1581 outdir,
1582 *,
1490 builddir=None, 1583 builddir=None,
1491 includes=None, 1584 includes=None,
1492 defines=None, 1585 defines=None,
1493 libpaths=None, 1586 libpaths=None,
1494 libs=None, 1587 libs=None,
1496 debug=False, 1589 debug=False,
1497 compiler_extra='', 1590 compiler_extra='',
1498 linker_extra='', 1591 linker_extra='',
1499 swig=None, 1592 swig=None,
1500 cpp=True, 1593 cpp=True,
1594 source_extra=None,
1501 prerequisites_swig=None, 1595 prerequisites_swig=None,
1502 prerequisites_compile=None, 1596 prerequisites_compile=None,
1503 prerequisites_link=None, 1597 prerequisites_link=None,
1504 infer_swig_includes=True, 1598 infer_swig_includes=True,
1505 py_limited_api=False, 1599 py_limited_api=False,
1537 `/LIBPATH:` on Windows or `-L` on Unix. 1631 `/LIBPATH:` on Windows or `-L` on Unix.
1538 libs 1632 libs
1539 A string, or a sequence of library names. Each item is prefixed 1633 A string, or a sequence of library names. Each item is prefixed
1540 with `-l` on non-Windows. 1634 with `-l` on non-Windows.
1541 optimise: 1635 optimise:
1542 Whether to use compiler optimisations. 1636 Whether to use compiler optimisations and define NDEBUG.
1543 debug: 1637 debug:
1544 Whether to build with debug symbols. 1638 Whether to build with debug symbols.
1545 compiler_extra: 1639 compiler_extra:
1546 Extra compiler flags. Can be None. 1640 Extra compiler flags. Can be None.
1547 linker_extra: 1641 linker_extra:
1548 Extra linker flags. Can be None. 1642 Extra linker flags. Can be None.
1549 swig: 1643 swig:
1550 Swig command; if false we use 'swig'. 1644 Swig command; if false we use 'swig'.
1551 cpp: 1645 cpp:
1552 If true we tell SWIG to generate C++ code instead of C. 1646 If true we tell SWIG to generate C++ code instead of C.
1647 source_extra:
1648 Extra source files to build into the shared library. A string, a sequence of strings, or None.
1553 prerequisites_swig: 1649 prerequisites_swig:
1554 prerequisites_compile: 1650 prerequisites_compile:
1555 prerequisites_link: 1651 prerequisites_link:
1556 1652
1557 [These are mainly for use on Windows. On other systems we 1653 [These are mainly for use on Windows. On other systems we
1582 infer_swig_includes: 1678 infer_swig_includes:
1583 If true, we extract `-I<path>` and `-I <path>` args from 1679 If true, we extract `-I<path>` and `-I <path>` args from
1584 `compiler_extra` (also `/I` on Windows) and use them with swig so 1680
1585 that it can see the same header files as C/C++. This is useful 1681
1586 when using environment variables such as `CC` and `CXX` to set 1682
1587 `compiler_extra`. 1683 `compiler_extra`.
1588 py_limited_api: 1684 py_limited_api:
1589 If true we build for current Python's limited API / stable ABI. 1685 If true we build for current Python's limited API / stable ABI.
1686
1687 Note that we will assert false if this extension is added to a
1688 pipcl.Package that has a different <py_limited_api>, because
1689 on Windows importing a non-py_limited_api extension inside a
1690 py_limited_api=True package fails.
1590 1691
1591 Returns the leafname of the generated library file within `outdir`, e.g. 1692 Returns the leafname of the generated library file within `outdir`, e.g.
1592 `_{name}.so` on Unix or `_{name}.cp311-win_amd64.pyd` on Windows. 1693 `_{name}.so` on Unix or `_{name}.cp311-win_amd64.pyd` on Windows.
1593 ''' 1694 '''
1594 if compiler_extra is None: 1695 if compiler_extra is None:
1597 linker_extra = '' 1698 linker_extra = ''
1598 if builddir is None: 1699 if builddir is None:
1599 builddir = outdir 1700 builddir = outdir
1600 if not swig: 1701 if not swig:
1601 swig = 'swig' 1702 swig = 'swig'
1703
1704 if source_extra is None:
1705 source_extra = list()
1706 if isinstance(source_extra, str):
1707 source_extra = [source_extra]
1708
1602 includes_text = _flags( includes, '-I') 1709 includes_text = _flags( includes, '-I')
1603 defines_text = _flags( defines, '-D') 1710 defines_text = _flags( defines, '-D')
1604 libpaths_text = _flags( libpaths, '/LIBPATH:', '"') if windows() else _flags( libpaths, '-L') 1711 libpaths_text = _flags( libpaths, '/LIBPATH:', '"') if windows() else _flags( libpaths, '-L')
1605 libs_text = _flags( libs, '' if windows() else '-l') 1712 libs_text = _flags( libs, '' if windows() else '-l')
1606 path_cpp = f'{builddir}/{os.path.basename(path_i)}' 1713 path_cpp = f'{builddir}/{os.path.basename(path_i)}'
1607 path_cpp += '.cpp' if cpp else '.c' 1714 path_cpp += '.cpp' if cpp else '.c'
1608 os.makedirs( outdir, exist_ok=True) 1715 os.makedirs( outdir, exist_ok=True)
1609 1716
1610 # Run SWIG. 1717 # Run SWIG.
1611 1718 #
1612 if infer_swig_includes: 1719 if infer_swig_includes:
1613 # Extract include flags from `compiler_extra`. 1720 # Extract include flags from `compiler_extra`.
1614 swig_includes_extra = '' 1721 swig_includes_extra = ''
1615 compiler_extra_items = compiler_extra.split() 1722 compiler_extra_items = shlex.split(compiler_extra)
1616 i = 0 1723 i = 0
1617 while i < len(compiler_extra_items): 1724 while i < len(compiler_extra_items):
1618 item = compiler_extra_items[i] 1725 item = compiler_extra_items[i]
1619 # Swig doesn't seem to like a space after `I`. 1726 # Swig doesn't seem to like a space after `I`.
1620 if item == '-I' or (windows() and item == '/I'): 1727 if item == '-I' or (windows() and item == '/I'):
1645 path_i, 1752 path_i,
1646 prerequisites_swig, 1753 prerequisites_swig,
1647 prerequisites_swig2, 1754 prerequisites_swig2,
1648 ) 1755 )
1649 1756
1650 so_suffix = _so_suffix(use_so_versioning = not py_limited_api) 1757 if pyodide():
1758 so_suffix = '.so'
1759 log0(f'pyodide: PEP-3149 suffix untested, so omitting. {_so_suffix()=}.')
1760 else:
1761 so_suffix = _so_suffix(use_so_versioning = not py_limited_api)
1651 path_so_leaf = f'_{name}{so_suffix}' 1762 path_so_leaf = f'_{name}{so_suffix}'
1652 path_so = f'{outdir}/{path_so_leaf}' 1763 path_so = f'{outdir}/{path_so_leaf}'
1653 1764
1654 py_limited_api2 = current_py_limited_api() if py_limited_api else None 1765 py_limited_api2 = current_py_limited_api() if py_limited_api else None
1655 1766
1767 compiler_command, pythonflags = base_compiler(cpp=cpp)
1768 linker_command, _ = base_linker(cpp=cpp)
1769 # setuptools on Linux seems to use slightly different compile flags:
1770 #
1771 # -fwrapv -O3 -Wall -O2 -g0 -DPY_CALL_TRAMPOLINE
1772 #
1773
1774 general_flags = ''
1656 if windows(): 1775 if windows():
1657 path_obj = f'{path_so}.obj'
1658
1659 permissive = '/permissive-' 1776 permissive = '/permissive-'
1660 EHsc = '/EHsc' 1777 EHsc = '/EHsc'
1661 T = '/Tp' if cpp else '/Tc' 1778 T = '/Tp' if cpp else '/Tc'
1662 optimise2 = '/DNDEBUG /O2' if optimise else '/D_DEBUG' 1779 optimise2 = '/DNDEBUG /O2' if optimise else '/D_DEBUG'
1663 debug2 = '' 1780 debug2 = '/Zi' if debug else ''
1781 py_limited_api3 = f'/DPy_LIMITED_API={py_limited_api2}' if py_limited_api2 else ''
1782
1783 else:
1664 if debug: 1784 if debug:
1665 debug2 = '/Zi' # Generate .pdb. 1785 general_flags += '/Zi' if windows() else ' -g'
1666 # debug2 = '/Z7' # Embed debug info in .obj files. 1786 if optimise:
1667 1787 general_flags += ' /DNDEBUG /O2' if windows() else ' -O2 -DNDEBUG'
1668 py_limited_api3 = f'/DPy_LIMITED_API={py_limited_api2}' if py_limited_api2 else '' 1788
1669 1789 py_limited_api3 = f'-DPy_LIMITED_API={py_limited_api2}' if py_limited_api2 else ''
1670 # As of 2023-08-23, it looks like VS tools create slightly 1790
1671 # .dll's each time, even with identical inputs. 1791 if windows():
1792 pass
1793 elif darwin():
1794 # MacOS's linker does not like `-z origin`.
1795 rpath_flag = "-Wl,-rpath,@loader_path/"
1796 # Avoid `Undefined symbols for ... "_PyArg_UnpackTuple" ...'.
1797 general_flags += ' -undefined dynamic_lookup'
1798 elif pyodide():
1799 # Setting `-Wl,-rpath,'$ORIGIN',-z,origin` gives:
1800 # emcc: warning: ignoring unsupported linker flag: `-rpath` [-Wlinkflags]
1801 # wasm-ld: error: unknown -z value: origin
1672 # 1802 #
1673 # Some info about this is at: 1803 rpath_flag = "-Wl,-rpath,'$ORIGIN'"
1674 # https://nikhilism.com/post/2020/windows-deterministic-builds/. 1804 else:
1675 # E.g. an undocumented linker flag `/Brepro`. 1805 rpath_flag = "-Wl,-rpath,'$ORIGIN',-z,origin"
1676 # 1806
1677 1807 # Fun fact - on Linux, if the -L and -l options are before '{path_cpp}'
1678 command, pythonflags = base_compiler(cpp=cpp) 1808 # they seem to be ignored...
1679 command = f''' 1809 #
1680 {command} 1810 path_os = list()
1681 # General: 1811
1682 /c # Compiles without linking. 1812 for path_source in [path_cpp] + source_extra:
1683 {EHsc} # Enable "Standard C++ exception handling". 1813 path_o = f'{path_source}.obj' if windows() else f'{path_source}.o'
1684 1814 path_os.append(f' {path_o}')
1685 #/MD # Creates a multithreaded DLL using MSVCRT.lib. 1815
1686 {'/MDd' if debug else '/MD'} 1816 prerequisites_path = f'{path_o}.d'
1687 1817
1688 # Input/output files: 1818 if windows():
1689 {T}{path_cpp} # /Tp specifies C++ source file. 1819 compiler_command2 = f'''
1690 /Fo{path_obj} # Output file. codespell:ignore 1820 {compiler_command}
1691 1821 # General:
1692 # Include paths: 1822 /c # Compiles without linking.
1693 {includes_text} 1823 {EHsc} # Enable "Standard C++ exception handling".
1694 {pythonflags.includes} # Include path for Python headers. 1824
1695 1825 #/MD # Creates a multithreaded DLL using MSVCRT.lib.
1696 # Code generation: 1826 {'/MDd' if debug else '/MD'}
1697 {optimise2} 1827
1698 {debug2} 1828 # Input/output files:
1699 {permissive} # Set standard-conformance mode. 1829 {T}{path_source} # /Tp specifies C++ source file.
1700 1830 /Fo{path_o} # Output file. codespell:ignore
1701 # Diagnostics: 1831
1702 #/FC # Display full path of source code files passed to cl.exe in diagnostic text. 1832 # Include paths:
1703 /W3 # Sets which warning level to output. /W3 is IDE default. 1833 {includes_text}
1704 /diagnostics:caret # Controls the format of diagnostic messages. 1834 {pythonflags.includes} # Include path for Python headers.
1705 /nologo # 1835
1706 1836 # Code generation:
1707 {defines_text} 1837 {optimise2}
1708 {compiler_extra} 1838 {debug2}
1709 1839 {permissive} # Set standard-conformance mode.
1710 {py_limited_api3} 1840
1711 ''' 1841 # Diagnostics:
1712 run_if( command, path_obj, path_cpp, prerequisites_compile) 1842 #/FC # Display full path of source code files passed to cl.exe in diagnostic text.
1713 1843 /W3 # Sets which warning level to output. /W3 is IDE default.
1714 command, pythonflags = base_linker(cpp=cpp) 1844 /diagnostics:caret # Controls the format of diagnostic messages.
1845 /nologo #
1846
1847 {defines_text}
1848 {compiler_extra}
1849
1850 {py_limited_api3}
1851 '''
1852
1853 else:
1854 compiler_command2 = f'''
1855 {compiler_command}
1856 -fPIC
1857 {general_flags.strip()}
1858 {pythonflags.includes}
1859 {includes_text}
1860 {defines_text}
1861 -MD -MF {prerequisites_path}
1862 -c {path_source}
1863 -o {path_o}
1864 {compiler_extra}
1865 {py_limited_api3}
1866 '''
1867 run_if(
1868 compiler_command2,
1869 path_o,
1870 path_source,
1871 [path_source] + _get_prerequisites(prerequisites_path),
1872 )
1873
1874 # Link
1875 prerequisites_path = f'{path_so}.d'
1876 if windows():
1715 debug2 = '/DEBUG' if debug else '' 1877 debug2 = '/DEBUG' if debug else ''
1716 base, _ = os.path.splitext(path_so_leaf) 1878 base, _ = os.path.splitext(path_so_leaf)
1717 command = f''' 1879 command2 = f'''
1718 {command} 1880 {linker_command}
1719 /DLL # Builds a DLL. 1881 /DLL # Builds a DLL.
1720 /EXPORT:PyInit__{name} # Exports a function. 1882 /EXPORT:PyInit__{name} # Exports a function.
1721 /IMPLIB:{base}.lib # Overrides the default import library name. 1883 /IMPLIB:{base}.lib # Overrides the default import library name.
1722 {libpaths_text} 1884 {libpaths_text}
1723 {pythonflags.ldflags} 1885 {pythonflags.ldflags}
1724 /OUT:{path_so} # Specifies the output file name. 1886 /OUT:{path_so} # Specifies the output file name.
1725 {debug2} 1887 {debug2}
1726 /nologo 1888 /nologo
1727 {libs_text} 1889 {libs_text}
1728 {path_obj} 1890 {' '.join(path_os)}
1729 {linker_extra} 1891 {linker_extra}
1730 ''' 1892 '''
1731 run_if( command, path_so, path_obj, prerequisites_link) 1893 elif pyodide():
1732 1894 command2 = f'''
1895 {linker_command}
1896 -MD -MF {prerequisites_path}
1897 -o {path_so}
1898 {' '.join(path_os)}
1899 {libpaths_text}
1900 {libs_text}
1901 {linker_extra}
1902 {pythonflags.ldflags}
1903 {rpath_flag}
1904 '''
1733 else: 1905 else:
1734 1906 command2 = f'''
1735 # Not Windows. 1907 {linker_command}
1736 # 1908 -shared
1737 command, pythonflags = base_compiler(cpp=cpp) 1909 {general_flags.strip()}
1738 1910 -MD -MF {prerequisites_path}
1739 # setuptools on Linux seems to use slightly different compile flags: 1911 -o {path_so}
1740 # 1912 {' '.join(path_os)}
1741 # -fwrapv -O3 -Wall -O2 -g0 -DPY_CALL_TRAMPOLINE 1913 {libpaths_text}
1742 # 1914 {libs_text}
1743 1915 {linker_extra}
1744 general_flags = '' 1916 {pythonflags.ldflags}
1745 if debug: 1917 {rpath_flag}
1746 general_flags += ' -g' 1918 {py_limited_api3}
1747 if optimise: 1919 '''
1748 general_flags += ' -O2 -DNDEBUG' 1920 link_was_run = run_if(
1749 1921 command2,
1750 py_limited_api3 = f'-DPy_LIMITED_API={py_limited_api2}' if py_limited_api2 else '' 1922 path_so,
1751 1923 path_cpp,
1752 if darwin(): 1924 *path_os,
1753 # MacOS's linker does not like `-z origin`. 1925 *_get_prerequisites(f'{path_so}.d'),
1754 rpath_flag = "-Wl,-rpath,@loader_path/" 1926 )
1755 1927
1756 # Avoid `Undefined symbols for ... "_PyArg_UnpackTuple" ...'. 1928 if link_was_run and darwin():
1757 general_flags += ' -undefined dynamic_lookup' 1929 # We need to patch up references to shared libraries in `libs`.
1758 elif pyodide(): 1930 sublibraries = list()
1759 # Setting `-Wl,-rpath,'$ORIGIN',-z,origin` gives: 1931 for lib in () if libs is None else libs:
1760 # emcc: warning: ignoring unsupported linker flag: `-rpath` [-Wlinkflags] 1932 for libpath in libpaths:
1761 # wasm-ld: error: unknown -z value: origin 1933 found = list()
1762 # 1934 for suffix in '.so', '.dylib':
1763 log0(f'pyodide: PEP-3149 suffix untested, so omitting. {_so_suffix()=}.') 1935 path = f'{libpath}/lib{os.path.basename(lib)}{suffix}'
1764 path_so_leaf = f'_{name}.so' 1936 if os.path.exists( path):
1765 path_so = f'{outdir}/{path_so_leaf}' 1937 found.append( path)
1766 1938 if found:
1767 rpath_flag = '' 1939 assert len(found) == 1, f'More than one file matches lib={lib!r}: {found}'
1768 else: 1940 sublibraries.append( found[0])
1769 rpath_flag = "-Wl,-rpath,'$ORIGIN',-z,origin" 1941 break
1770 path_so = f'{outdir}/{path_so_leaf}' 1942 else:
1771 # Fun fact - on Linux, if the -L and -l options are before '{path_cpp}' 1943 log2(f'Warning: can not find path of lib={lib!r} in libpaths={libpaths}')
1772 # they seem to be ignored... 1944 macos_patch( path_so, *sublibraries)
1773 #
1774 prerequisites = list()
1775
1776 if pyodide():
1777 # Looks like pyodide's `cc` can't compile and link in one invocation.
1778 prerequisites_compile_path = f'{path_cpp}.o.d'
1779 prerequisites += _get_prerequisites( prerequisites_compile_path)
1780 command = f'''
1781 {command}
1782 -fPIC
1783 {general_flags.strip()}
1784 {pythonflags.includes}
1785 {includes_text}
1786 {defines_text}
1787 -MD -MF {prerequisites_compile_path}
1788 -c {path_cpp}
1789 -o {path_cpp}.o
1790 {compiler_extra}
1791 {py_limited_api3}
1792 '''
1793 prerequisites_link_path = f'{path_cpp}.o.d'
1794 prerequisites += _get_prerequisites( prerequisites_link_path)
1795 ld, _ = base_linker(cpp=cpp)
1796 command += f'''
1797 && {ld}
1798 {path_cpp}.o
1799 -o {path_so}
1800 -MD -MF {prerequisites_link_path}
1801 {rpath_flag}
1802 {libpaths_text}
1803 {libs_text}
1804 {linker_extra}
1805 {pythonflags.ldflags}
1806 '''
1807 else:
1808 # We use compiler to compile and link in one command.
1809 prerequisites_path = f'{path_so}.d'
1810 prerequisites = _get_prerequisites(prerequisites_path)
1811
1812 command = f'''
1813 {command}
1814 -fPIC
1815 -shared
1816 {general_flags.strip()}
1817 {pythonflags.includes}
1818 {includes_text}
1819 {defines_text}
1820 {path_cpp}
1821 -MD -MF {prerequisites_path}
1822 -o {path_so}
1823 {compiler_extra}
1824 {libpaths_text}
1825 {linker_extra}
1826 {pythonflags.ldflags}
1827 {libs_text}
1828 {rpath_flag}
1829 {py_limited_api3}
1830 '''
1831 command_was_run = run_if(
1832 command,
1833 path_so,
1834 path_cpp,
1835 prerequisites_compile,
1836 prerequisites_link,
1837 prerequisites,
1838 )
1839
1840 if command_was_run and darwin():
1841 # We need to patch up references to shared libraries in `libs`.
1842 sublibraries = list()
1843 for lib in () if libs is None else libs:
1844 for libpath in libpaths:
1845 found = list()
1846 for suffix in '.so', '.dylib':
1847 path = f'{libpath}/lib{os.path.basename(lib)}{suffix}'
1848 if os.path.exists( path):
1849 found.append( path)
1850 if found:
1851 assert len(found) == 1, f'More than one file matches lib={lib!r}: {found}'
1852 sublibraries.append( found[0])
1853 break
1854 else:
1855 log2(f'Warning: can not find path of lib={lib!r} in libpaths={libpaths}')
1856 macos_patch( path_so, *sublibraries)
1857 1945
1858 #run(f'ls -l {path_so}', check=0) 1946 #run(f'ls -l {path_so}', check=0)
1859 #run(f'file {path_so}', check=0) 1947 #run(f'file {path_so}', check=0)
1860 1948
1949 _extensions_to_py_limited_api[os.path.abspath(path_so)] = py_limited_api
1950
1861 return path_so_leaf 1951 return path_so_leaf
1862 1952
1863 1953
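A hedged usage sketch for build_extension() as described above (the module name, SWIG interface file, include/library paths and the C library name are placeholders, not taken from this changeset):

    import pipcl

    def build(config_settings=None):
        # Generate and compile a SWIG extension; the returned leafname is
        # e.g. '_example.so' on Unix or '_example.cp311-win_amd64.pyd' on
        # Windows.
        path_so_leaf = pipcl.build_extension(
                name = 'example',
                path_i = 'swig/example.i',
                outdir = 'build',
                includes = ['include'],
                libpaths = ['lib'],
                libs = ['example_c'],
                py_limited_api = False,    # Must match the pipcl.Package setting.
                )
        # Install the shared library into the package directory; the
        # SWIG-generated example.py wrapper would be returned similarly.
        return [(f'build/{path_so_leaf}', f'example/{path_so_leaf}')]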
1864 # Functions that might be useful. 1954 # Functions that might be useful.
1865 # 1955 #
1981 capture=1, 2071 capture=1,
1982 check=0 2072 check=0
1983 ) 2073 )
1984 if not e: 2074 if not e:
1985 branch = out.strip() 2075 branch = out.strip()
1986 log(f'git_info(): directory={directory!r} returning branch={branch!r} sha={sha!r} comment={comment!r}') 2076 log1(f'git_info(): directory={directory!r} returning branch={branch!r} sha={sha!r} comment={comment!r}')
1987 return sha, comment, diff, branch 2077 return sha, comment, diff, branch
1988 2078
1989 2079
1990 def git_items( directory, submodules=False): 2080 def git_items( directory, submodules=False):
1991 ''' 2081 '''
2025 ret.append(path) 2115 ret.append(path)
2026 return ret 2116 return ret
2027 2117
2028 2118
2029 def git_get( 2119 def git_get(
2030 remote,
2031 local, 2120 local,
2032 *, 2121 *,
2122 remote=None,
2033 branch=None, 2123 branch=None,
2124 tag=None,
2125 text=None,
2034 depth=1, 2126 depth=1,
2035 env_extra=None, 2127 env_extra=None,
2036 tag=None,
2037 update=True, 2128 update=True,
2038 submodules=True, 2129 submodules=True,
2039 default_remote=None,
2040 ): 2130 ):
2041 ''' 2131 '''
2042 Ensures that <local> is a git checkout (at either <tag>, or <branch> HEAD) 2132 Creates/updates a local checkout <local> of a remote repository and returns
2043 of a remote repository. 2133 the absolute path of <local>.
2044 2134
2045 Exactly one of <branch> and <tag> must be specified, or <remote> must start 2135 If <text> is set but does not start with 'git:', it is assumed to be an up
2046 with 'git:' and match the syntax described below. 2136 to date local checkout, and we return the absolute path of <text> without doing
2137 any git operations.
2047 2138
2048 Args: 2139 Args:
2140 local:
2141 Local directory. Created and/or updated using `git clone` and `git
2142 fetch` etc.
2049 remote: 2143 remote:
2050 Remote git repository, for example 2144
2051 'https://github.com/ArtifexSoftware/mupdf.git'. 2145 'https://github.com/ArtifexSoftware/mupdf.git'. Can be overridden
2146 by <text>.
2147 branch:
2148 Branch to use; can be overridden by <text>.
2149 tag:
2150 Tag to use; can be overridden by <text>.
2151 text:
2152 If None or empty:
2153 Ignored.
2052 2154
2053 If starts with 'git:', the remaining text should be a command-line 2155 If starts with 'git:':
2054 style string containing some or all of these args: 2156 The remaining text should be a command-line
2055 --branch <branch> 2157 style string containing some or all of these args:
2056 --tag <tag> 2158 --branch <branch>
2057 <remote> 2159 --tag <tag>
2058 These overrides <branch>, <tag> and <default_remote>. 2160 <remote>
2161 These overrides <branch>, <tag> and <remote>.
2162 Otherwise:
2163 <text> is assumed to be a local directory, and we simply return
2164 it as an absolute path without doing any git operations.
2059 2165
2060 For example these all clone/update/branch master of https://foo.bar/qwerty.git to local 2166 For example these all clone/update/branch master of https://foo.bar/qwerty.git to local
2061 checkout 'foo-local': 2167 checkout 'foo-local':
2062 2168
2063 git_get('https://foo.bar/qwerty.git', 'foo-local', branch='master') 2169 git_get('foo-local', remote='https://foo.bar/qwerty.git', branch='master')
2064 git_get('git:--branch master https://foo.bar/qwerty.git', 'foo-local') 2170 git_get('foo-local', text='git:--branch master https://foo.bar/qwerty.git')
2065 git_get('git:--branch master', 'foo-local', default_remote='https://foo.bar/qwerty.git') 2171 git_get('foo-local', text='git:--branch master', remote='https://foo.bar/qwerty.git')
2066 git_get('git:', 'foo-local', branch='master', default_remote='https://foo.bar/qwerty.git') 2172 git_get('foo-local', text='git:', branch='master', remote='https://foo.bar/qwerty.git')
2067
2068 local:
2069 Local directory. If <local>/.git exists, we attempt to run `git
2070 update` in it.
2071 branch:
2072 Branch to use. Is used as default if remote starts with 'git:'.
2073 depth: 2173 depth:
2074 Depth of local checkout when cloning and fetching, or None. 2174 Depth of local checkout when cloning and fetching, or None.
2075 env_extra: 2175 env_extra:
2076 Dict of extra name=value environment variables to use whenever we 2176 Dict of extra name=value environment variables to use whenever we
2077 run git. 2177 run git.
2078 tag:
2079 Tag to use. Is used as default if remote starts with 'git:'.
2080 update: 2178 update:
2081 If false we do not update existing repository. Might be useful if 2179 If false we do not update existing repository. Might be useful if
2082 testing without network access. 2180 testing without network access.
2083 submodules: 2181 submodules:
2084 If true, we clone with `--recursive --shallow-submodules` and run 2182 If true, we clone with `--recursive --shallow-submodules` and run
2085 `git submodule update --init --recursive` before returning. 2183 `git submodule update --init --recursive` before returning.
2086 default_remote:
2087 The remote URL if <remote> starts with 'git:' but does not specify
2088 the remote URL.
2089 ''' 2184 '''
2090 log0(f'{remote=} {local=} {branch=} {tag=}') 2185 log0(f'{remote=} {local=} {branch=} {tag=}')
2091 if remote.startswith('git:'): 2186
2092 remote0 = remote 2187 if text:
2093 args = iter(shlex.split(remote0[len('git:'):])) 2188 if text.startswith('git:'):
2094 remote = default_remote 2189 args = iter(shlex.split(text[len('git:'):]))
2095 while 1: 2190 while 1:
2096 try: 2191 try:
2097 arg = next(args) 2192 arg = next(args)
2098 except StopIteration: 2193 except StopIteration:
2099 break 2194 break
2100 if arg == '--branch': 2195 if arg == '--branch':
2101 branch = next(args) 2196 branch = next(args)
2102 tag = None 2197 tag = None
2103 elif arg == '--tag': 2198 elif arg == '--tag':
2104 tag == next(args) 2199 tag = next(args)
2105 branch = None 2200 branch = None
2106 else: 2201 else:
2107 remote = arg 2202 remote = arg
2108 assert remote, f'{default_remote=} and no remote specified in remote={remote0!r}.' 2203 assert remote, f'<remote> unset and no remote specified in {text=}.'
2109 assert branch or tag, f'{branch=} {tag=} and no branch/tag specified in remote={remote0!r}.' 2204 assert branch or tag, f'<branch> and <tag> unset and no branch/tag specified in {text=}.'
2205 else:
2206 log0(f'Using local directory {text!r}.')
2207 return os.path.abspath(text)
2110 2208
2111 assert (branch and not tag) or (not branch and tag), f'Must specify exactly one of <branch> and <tag>.' 2209 assert (branch and not tag) or (not branch and tag), f'Must specify exactly one of <branch> and <tag>; {branch=} {tag=}.'
2112 2210
2113 depth_arg = f' --depth {depth}' if depth else '' 2211 depth_arg = f' --depth {depth}' if depth else ''
2114 2212
2115 def do_update(): 2213 def do_update():
2116 # This seems to pull in the entire repository. 2214 # This seems to pull in the entire repository.
2117 log0(f'do_update(): attempting to update {local=}.') 2215 log0(f'do_update(): attempting to update {local=}.')
2118 # Remove any local changes. 2216 # Remove any local changes.
2119 run(f'cd {local} && git checkout .', env_extra=env_extra) 2217 run(f'cd {local} && git reset --hard', env_extra=env_extra)
2120 if tag: 2218 if tag:
2121 # `-u` avoids `fatal: Refusing to fetch into current branch`. 2219 # `-u` avoids `fatal: Refusing to fetch into current branch`.
2122 # Using '+' and `refs/tags/` prefix seems to avoid errors like: 2220
2123 # error: cannot update ref 'refs/heads/v3.16.44': 2221 # error: cannot update ref 'refs/heads/v3.16.44':
2124 # trying to write non-commit object 2222 # trying to write non-commit object
2162 if submodules: 2260 if submodules:
2163 run(f'cd {local} && git submodule update --init --recursive', env_extra=env_extra) 2261 run(f'cd {local} && git submodule update --init --recursive', env_extra=env_extra)
2164 2262
2165 # Show sha of checkout. 2263 # Show sha of checkout.
2166 run( f'cd {local} && git show --pretty=oneline|head -n 1', check=False) 2264 run( f'cd {local} && git show --pretty=oneline|head -n 1', check=False)
2265 return os.path.abspath(local)
2167 2266
2168 2267
2169 def run( 2268 def run(
2170 command, 2269 command,
2171 *, 2270 *,
2450 ldflags2 = self.ldflags.replace(' -lcrypt ', ' ') 2549 ldflags2 = self.ldflags.replace(' -lcrypt ', ' ')
2451 if ldflags2 != self.ldflags: 2550 if ldflags2 != self.ldflags:
2452 log2(f'### Have removed `-lcrypt` from ldflags: {self.ldflags!r} -> {ldflags2!r}') 2551 log2(f'### Have removed `-lcrypt` from ldflags: {self.ldflags!r} -> {ldflags2!r}')
2453 self.ldflags = ldflags2 2552 self.ldflags = ldflags2
2454 2553
2455 log1(f'{self.includes=}') 2554 if 0:
2456 log1(f' {includes_=}') 2555 log1(f'{self.includes=}')
2457 log1(f'{self.ldflags=}') 2556 log1(f' {includes_=}')
2458 log1(f' {ldflags_=}') 2557 log1(f'{self.ldflags=}')
2558 log1(f' {ldflags_=}')
2459 2559
2460 2560
2461 def macos_add_cross_flags(command): 2561 def macos_add_cross_flags(command):
2462 ''' 2562 '''
2463 If running on MacOS and environment variables ARCHFLAGS is set 2563 If running on MacOS and environment variables ARCHFLAGS is set
2553 ''' 2653 '''
2554 #log(f'sys.maxsize={hex(sys.maxsize)}') 2654 #log(f'sys.maxsize={hex(sys.maxsize)}')
2555 return f'x{32 if sys.maxsize == 2**31 - 1 else 64}' 2655 return f'x{32 if sys.maxsize == 2**31 - 1 else 64}'
2556 2656
2557 2657
2558 def run_if( command, out, *prerequisites): 2658 def run_if( command, out, *prerequisites, caller=1):
2559 ''' 2659 '''
2560 Runs a command only if the output file is not up to date. 2660 Runs a command only if the output file is not up to date.
2561 2661
2562 Args: 2662 Args:
2563 command: 2663 command:
2583 >>> out = 'run_if_test_out' 2683 >>> out = 'run_if_test_out'
2584 >>> if os.path.exists( out): 2684 >>> if os.path.exists( out):
2585 ... os.remove( out) 2685 ... os.remove( out)
2586 >>> if os.path.exists( f'{out}.cmd'): 2686 >>> if os.path.exists( f'{out}.cmd'):
2587 ... os.remove( f'{out}.cmd') 2687 ... os.remove( f'{out}.cmd')
2588 >>> run_if( f'touch {out}', out) 2688 >>> run_if( f'touch {out}', out, caller=0)
2589 pipcl.py:run_if(): Running command because: File does not exist: 'run_if_test_out' 2689 pipcl.py:run_if(): Running command because: File does not exist: 'run_if_test_out'
2590 pipcl.py:run_if(): Running: touch run_if_test_out 2690 pipcl.py:run_if(): Running: touch run_if_test_out
2591 True 2691 True
2592 2692
2593 If we repeat, the output file will be up to date so the command is not run: 2693 If we repeat, the output file will be up to date so the command is not run:
2594 2694
2595 >>> run_if( f'touch {out}', out) 2695 >>> run_if( f'touch {out}', out, caller=0)
2596 pipcl.py:run_if(): Not running command because up to date: 'run_if_test_out' 2696 pipcl.py:run_if(): Not running command because up to date: 'run_if_test_out'
2597 2697
2598 If we change the command, the command is run: 2698 If we change the command, the command is run:
2599 2699
2600 >>> run_if( f'touch {out}', out) 2700 >>> run_if( f'touch {out};', out, caller=0)
2601 pipcl.py:run_if(): Running command because: Command has changed 2701 pipcl.py:run_if(): Running command because: Command has changed:
2602 pipcl.py:run_if(): Running: touch run_if_test_out 2702 pipcl.py:run_if(): @@ -1,2 +1,2 @@
2703 pipcl.py:run_if(): touch
2704 pipcl.py:run_if(): -run_if_test_out
2705 pipcl.py:run_if(): +run_if_test_out;
2706 pipcl.py:run_if():
2707 pipcl.py:run_if(): Running: touch run_if_test_out;
2603 True 2708 True
2604 2709
2605 If we add a prerequisite that is newer than the output, the command is run: 2710 If we add a prerequisite that is newer than the output, the command is run:
2606 2711
2607 >>> time.sleep(1) 2712 >>> time.sleep(1)
2608 >>> prerequisite = 'run_if_test_prerequisite' 2713 >>> prerequisite = 'run_if_test_prerequisite'
2609 >>> run( f'touch {prerequisite}', caller=0) 2714 >>> run( f'touch {prerequisite}', caller=0)
2610 pipcl.py:run(): Running: touch run_if_test_prerequisite 2715 pipcl.py:run(): Running: touch run_if_test_prerequisite
2611 >>> run_if( f'touch {out}', out, prerequisite) 2716 >>> run_if( f'touch {out}', out, prerequisite, caller=0)
2612 pipcl.py:run_if(): Running command because: Prerequisite is new: 'run_if_test_prerequisite' 2717 pipcl.py:run_if(): Running command because: Command has changed:
2718 pipcl.py:run_if(): @@ -1,2 +1,2 @@
2719 pipcl.py:run_if(): touch
2720 pipcl.py:run_if(): -run_if_test_out;
2721 pipcl.py:run_if(): +run_if_test_out
2722 pipcl.py:run_if():
2613 pipcl.py:run_if(): Running: touch run_if_test_out 2723 pipcl.py:run_if(): Running: touch run_if_test_out
2614 True 2724 True
2615 2725
2616 If we repeat, the output will be newer than the prerequisite, so the 2726 If we repeat, the output will be newer than the prerequisite, so the
2617 command is not run: 2727 command is not run:
2618 2728
2619 >>> run_if( f'touch {out}', out, prerequisite) 2729 >>> run_if( f'touch {out}', out, prerequisite, caller=0)
2620 pipcl.py:run_if(): Not running command because up to date: 'run_if_test_out' 2730 pipcl.py:run_if(): Not running command because up to date: 'run_if_test_out'
2621 ''' 2731 '''
2622 doit = False 2732 doit = False
2623 cmd_path = f'{out}.cmd' 2733 cmd_path = f'{out}.cmd'
2624 2734
2631 if os.path.isfile( cmd_path): 2741 if os.path.isfile( cmd_path):
2632 with open( cmd_path) as f: 2742 with open( cmd_path) as f:
2633 cmd = f.read() 2743 cmd = f.read()
2634 else: 2744 else:
2635 cmd = None 2745 cmd = None
2636 if command != cmd: 2746 cmd_args = shlex.split(cmd or '')
2747 command_args = shlex.split(command or '')
2748 if command_args != cmd_args:
2637 if cmd is None: 2749 if cmd is None:
2638 doit = 'No previous command stored' 2750 doit = 'No previous command stored'
2639 else: 2751 else:
2640 doit = f'Command has changed' 2752 doit = f'Command has changed'
2641 if 0: 2753 if 0:
2642 doit += f': {cmd!r} => {command!r}' 2754 doit += f':\n {cmd!r}\n {command!r}'
2755 if 0:
2756 doit += f'\nbefore:\n'
2757 doit += textwrap.indent(cmd, ' ')
2758 doit += f'\nafter:\n'
2759 doit += textwrap.indent(command, ' ')
2760 if 1:
2761 # Show diff based on commands split into pseudo lines by
2762 # shlex.split().
2763 doit += ':\n'
2764 lines = difflib.unified_diff(
2765 cmd.split(),
2766 command.split(),
2767 lineterm='',
2768 )
2769 # Skip initial lines.
2770 assert next(lines) == '--- '
2771 assert next(lines) == '+++ '
2772 for line in lines:
2773 doit += f' {line}\n'
2643 2774
2644 if not doit: 2775 if not doit:
2645 # See whether any prerequisites are newer than target. 2776 # See whether any prerequisites are newer than target.
2646 def _make_prerequisites(p): 2777 def _make_prerequisites(p):
2647 if isinstance( p, (list, tuple)): 2778 if isinstance( p, (list, tuple)):
2650 return [p] 2781 return [p]
2651 prerequisites_all = list() 2782 prerequisites_all = list()
2652 for p in prerequisites: 2783 for p in prerequisites:
2653 prerequisites_all += _make_prerequisites( p) 2784 prerequisites_all += _make_prerequisites( p)
2654 if 0: 2785 if 0:
2655 log2( 'prerequisites_all:') 2786 log2( 'prerequisites_all:', caller=caller+1)
2656 for i in prerequisites_all: 2787 for i in prerequisites_all:
2657 log2( f' {i!r}') 2788 log2( f' {i!r}', caller=caller+1)
2658 pre_mtime = 0 2789 pre_mtime = 0
2659 pre_path = None 2790 pre_path = None
2660 for prerequisite in prerequisites_all: 2791 for prerequisite in prerequisites_all:
2661 if isinstance( prerequisite, str): 2792 if isinstance( prerequisite, str):
2662 mtime = _fs_mtime_newest( prerequisite) 2793 mtime = _fs_mtime_newest( prerequisite)
2668 elif prerequisite: 2799 elif prerequisite:
2669 doit = str(prerequisite) 2800 doit = str(prerequisite)
2670 break 2801 break
2671 if not doit: 2802 if not doit:
2672 if pre_mtime > out_mtime: 2803 if pre_mtime > out_mtime:
2673 doit = f'Prerequisite is new: {pre_path!r}' 2804 doit = f'Prerequisite is new: {os.path.abspath(pre_path)!r}'
2674 2805
2675 if doit: 2806 if doit:
2676 # Remove `cmd_path` before we run the command, so any failure 2807 # Remove `cmd_path` before we run the command, so any failure
2677 # will force rerun next time. 2808 # will force rerun next time.
2678 # 2809 #
2679 try: 2810 try:
2680 os.remove( cmd_path) 2811 os.remove( cmd_path)
2681 except Exception: 2812 except Exception:
2682 pass 2813 pass
2683 log1( f'Running command because: {doit}') 2814 log1( f'Running command because: {doit}', caller=caller+1)
2684 2815
2685 run( command) 2816 run( command, caller=caller+1)
2686 2817
2687 # Write the command we ran, into `cmd_path`. 2818 # Write the command we ran, into `cmd_path`.
2688 with open( cmd_path, 'w') as f: 2819 with open( cmd_path, 'w') as f:
2689 f.write( command) 2820 f.write( command)
2690 return True 2821 return True
2691 else: 2822 else:
2692 log1( f'Not running command because up to date: {out!r}') 2823 log1( f'Not running command because up to date: {out!r}', caller=caller+1)
2693 2824
2694 if 0: 2825 if 0:
2695 log2( f'out_mtime={time.ctime(out_mtime)} pre_mtime={time.ctime(pre_mtime)}.' 2826 log2( f'out_mtime={time.ctime(out_mtime)} pre_mtime={time.ctime(pre_mtime)}.'
2696 f' pre_path={pre_path!r}: returning {ret!r}.' 2827 f' pre_path={pre_path!r}: returning {ret!r}.'
2697 ) 2828 )
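[Note] The decision logic in run_if() above amounts to a make-style check: re-run if the output is missing, if the command stored in `<out>.cmd` differs token-wise (via shlex.split, reported as a difflib unified diff) from the new command, or if any prerequisite is newer than the output. A minimal standalone sketch of the same pattern; the name rebuild_reason and all paths are illustrative and not part of pipcl's API:

    import os, shlex

    def rebuild_reason(command, out, *prerequisites):
        # Returns a reason string if the command should be re-run, else None.
        if not os.path.exists(out):
            return 'Output does not exist'
        cmd_path = f'{out}.cmd'
        stored = open(cmd_path).read() if os.path.isfile(cmd_path) else None
        if stored is None or shlex.split(stored) != shlex.split(command):
            return 'Command has changed'
        out_mtime = os.stat(out).st_mtime
        for p in prerequisites:
            if os.stat(p).st_mtime > out_mtime:
                return f'Prerequisite is new: {p!r}'
        return None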
2759 def _normalise(name): 2890 def _normalise(name):
2760 # https://packaging.python.org/en/latest/specifications/name-normalization/#name-normalization 2891 # https://packaging.python.org/en/latest/specifications/name-normalization/#name-normalization
2761 return re.sub(r"[-_.]+", "-", name).lower() 2892 return re.sub(r"[-_.]+", "-", name).lower()
2762 2893
2763 2894
2895 def _normalise2(name):
2896 # https://packaging.python.org/en/latest/specifications/binary-distribution-format/
2897 return _normalise(name).replace('-', '_')
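[Note] The two normalisations differ only in the separator character; illustrative values (the project name is made up):

    # _normalise() follows the PyPA name-normalisation rule; _normalise2() is
    # the wheel-filename variant with '-' replaced by '_'.
    #   _normalise('Friendly.Bard_Project')   -> 'friendly-bard-project'
    #   _normalise2('Friendly.Bard_Project')  -> 'friendly_bard_project'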
2898
2899
2764 def _assert_version_pep_440(version): 2900 def _assert_version_pep_440(version):
2765 assert re.match( 2901 assert re.match(
2766 r'^([1-9][0-9]*!)?(0|[1-9][0-9]*)(\.(0|[1-9][0-9]*))*((a|b|rc)(0|[1-9][0-9]*))?(\.post(0|[1-9][0-9]*))?(\.dev(0|[1-9][0-9]*))?$', 2902 r'^([1-9][0-9]*!)?(0|[1-9][0-9]*)(\.(0|[1-9][0-9]*))*((a|b|rc)(0|[1-9][0-9]*))?(\.post(0|[1-9][0-9]*))?(\.dev(0|[1-9][0-9]*))?$',
2767 version, 2903 version,
2768 ), \ 2904 ), \
2787 ''' 2923 '''
2788 Sets whether to include line numbers; helps with doctest. 2924 Sets whether to include line numbers; helps with doctest.
2789 ''' 2925 '''
2790 global g_log_line_numbers 2926 global g_log_line_numbers
2791 g_log_line_numbers = bool(yes) 2927 g_log_line_numbers = bool(yes)
2928
2929 def log(text='', caller=1):
2930 _log(text, 0, caller+1)
2792 2931
2793 def log0(text='', caller=1): 2932 def log0(text='', caller=1):
2794 _log(text, 0, caller+1) 2933 _log(text, 0, caller+1)
2795 2934
2796 def log1(text='', caller=1): 2935 def log1(text='', caller=1):
2811 print(f'{filename}:{fr.lineno}:{fr.function}(): {line}', file=sys.stdout, flush=1) 2950 print(f'{filename}:{fr.lineno}:{fr.function}(): {line}', file=sys.stdout, flush=1)
2812 else: 2951 else:
2813 print(f'{filename}:{fr.function}(): {line}', file=sys.stdout, flush=1) 2952 print(f'{filename}:{fr.function}(): {line}', file=sys.stdout, flush=1)
2814 2953
2815 2954
2816 def relpath(path, start=None): 2955 def relpath(path, start=None, allow_up=True):
2817 ''' 2956 '''
2818 A safe alternative to os.path.relpath(), avoiding an exception on Windows 2957 A safe alternative to os.path.relpath(), avoiding an exception on Windows
2819 if the drive needs to change - in this case we use os.path.abspath(). 2958 if the drive needs to change - in this case we use os.path.abspath().
2959
2960 Args:
2961 path:
2962 Path to be processed.
2963 start:
2964 Start directory or current directory if None.
2965 allow_up:
2966 If false we return the absolute path if <path> is not within <start>.
2820 ''' 2967 '''
2821 if windows(): 2968 if windows():
2822 try: 2969 try:
2823 return os.path.relpath(path, start) 2970 ret = os.path.relpath(path, start)
2824 except ValueError: 2971 except ValueError:
2825 # os.path.relpath() fails if trying to change drives. 2972 # os.path.relpath() fails if trying to change drives.
2826 return os.path.abspath(path) 2973 ret = os.path.abspath(path)
2827 else: 2974 else:
2828 return os.path.relpath(path, start) 2975 ret = os.path.relpath(path, start)
2976 if not allow_up and (ret.startswith('../') or ret.startswith('..\\')):
2977 ret = os.path.abspath(path)
2978 return ret
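[Note] Illustrative behaviour of relpath() with allow_up on a POSIX system; the paths are examples only:

    #   relpath('/work/proj/src/a.c', '/work/proj')              -> 'src/a.c'
    #   relpath('/elsewhere/b.c', '/work/proj')                  -> '../../elsewhere/b.c'
    #   relpath('/elsewhere/b.c', '/work/proj', allow_up=False)  -> '/elsewhere/b.c'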
2829 2979
2830 2980
2831 def _so_suffix(use_so_versioning=True): 2981 def _so_suffix(use_so_versioning=True):
2832 ''' 2982 '''
2833 Filename suffix for shared libraries is defined in pep-3149. The 2983 Filename suffix for shared libraries is defined in pep-3149. The
2979 ret = list() 3129 ret = list()
2980 items = self._items() 3130 items = self._items()
2981 for path, id_ in items.items(): 3131 for path, id_ in items.items():
2982 id0 = self.items0.get(path) 3132 id0 = self.items0.get(path)
2983 if id0 != id_: 3133 if id0 != id_:
2984 #mtime0, hash0 = id0
2985 #mtime1, hash1 = id_
2986 #log0(f'New/modified file {path=}.')
2987 #log0(f' {mtime0=} {"==" if mtime0==mtime1 else "!="} {mtime1=}.')
2988 #log0(f' {hash0=} {"==" if hash0==hash1 else "!="} {hash1=}.')
2989 ret.append(path) 3134 ret.append(path)
2990 return ret 3135 return ret
3136 def get_n(self, n):
3137 '''
3138 Returns new files matching <glob_pattern>, asserting that there are
3139 exactly <n>.
3140 '''
3141 ret = self.get()
3142 assert len(ret) == n, f'{len(ret)=}: {ret}'
3143 return ret
2991 def get_one(self): 3144 def get_one(self):
2992 ''' 3145 '''
2993 Returns new match of <glob_pattern>, asserting that there is exactly 3146 Returns new match of <glob_pattern>, asserting that there is exactly
2994 one. 3147 one.
2995 ''' 3148 '''
2996 ret = self.get() 3149 return self.get_n(1)[0]
2997 assert len(ret) == 1, f'{len(ret)=}'
2998 return ret[0]
2999 def _file_id(self, path): 3150 def _file_id(self, path):
3000 mtime = os.stat(path).st_mtime 3151 mtime = os.stat(path).st_mtime
3001 with open(path, 'rb') as f: 3152 with open(path, 'rb') as f:
3002 content = f.read() 3153 content = f.read()
3003 hash_ = hashlib.md5(content).digest() 3154 hash_ = hashlib.md5(content).digest()
3023 3174
3024 Otherwise we simply return <swig>. 3175 Otherwise we simply return <swig>.
3025 3176
3026 Args: 3177 Args:
3027 swig: 3178 swig:
3028 If starts with 'git:', passed as <remote> arg to git_remote(). 3179 If starts with 'git:', passed as <text> arg to git_get().
3029 quick: 3180 quick:
3030 If true, we do not update/build local checkout if the binary is 3181 If true, we do not update/build local checkout if the binary is
3031 already present. 3182 already present.
3032 swig_local: 3183 swig_local:
3033 path to use for checkout. 3184 path to use for checkout.
3034 ''' 3185 '''
3035 if swig and swig.startswith('git:'): 3186 if swig and swig.startswith('git:'):
3036 assert platform.system() != 'Windows' 3187 assert platform.system() != 'Windows', f'Cannot build swig on Windows.'
3037 swig_local = os.path.abspath(swig_local) 3188 # Note that {swig_local}/install/bin/swig doesn't work on MacOS because
3038 # Note that {swig_local}/install/bin/swig doesn't work on MacoS because
3039 # {swig_local}/INSTALL is a file and the fs is case-insensitive. 3189 # {swig_local}/INSTALL is a file and the fs is case-insensitive.
3040 swig_binary = f'{swig_local}/install-dir/bin/swig' 3190 swig_binary = f'{swig_local}/install-dir/bin/swig'
3041 if quick and os.path.isfile(swig_binary): 3191 if quick and os.path.isfile(swig_binary):
3042 log1(f'{quick=} and {swig_binary=} already exists, so not downloading/building.') 3192 log1(f'{quick=} and {swig_binary=} already exists, so not downloading/building.')
3043 else: 3193 else:
3044 # Clone swig. 3194 # Clone swig.
3045 swig_env_extra = None 3195 swig_env_extra = None
3046 git_get( 3196 swig_local = git_get(
3047 swig,
3048 swig_local, 3197 swig_local,
3049 default_remote='https://github.com/swig/swig.git', 3198 text=swig,
3199 remote='https://github.com/swig/swig.git',
3050 branch='master', 3200 branch='master',
3051 ) 3201 )
3052 if darwin(): 3202 if darwin():
3053 run(f'brew install automake') 3203 run(f'brew install automake')
3054 run(f'brew install pcre2') 3204 run(f'brew install pcre2')
3059 # > parallel can cause all kinds of trouble. 3209 # > parallel can cause all kinds of trouble.
3060 # > 3210 # >
3061 # > If you need to have bison first in your PATH, run: 3211 # > If you need to have bison first in your PATH, run:
3062 # > echo 'export PATH="/opt/homebrew/opt/bison/bin:$PATH"' >> ~/.zshrc 3212 # > echo 'export PATH="/opt/homebrew/opt/bison/bin:$PATH"' >> ~/.zshrc
3063 # 3213 #
3064 run(f'brew install bison') 3214 swig_env_extra = dict()
3065 PATH = os.environ['PATH'] 3215 macos_add_brew_path('bison', swig_env_extra)
3066 PATH = f'/opt/homebrew/opt/bison/bin:{PATH}' 3216 run(f'which bison')
3067 swig_env_extra = dict(PATH=PATH) 3217 run(f'which bison', env_extra=swig_env_extra)
3068 # Build swig. 3218 # Build swig.
3069 run(f'cd {swig_local} && ./autogen.sh', env_extra=swig_env_extra) 3219 run(f'cd {swig_local} && ./autogen.sh', env_extra=swig_env_extra)
3070 run(f'cd {swig_local} && ./configure --prefix={swig_local}/install-dir', env_extra=swig_env_extra) 3220 run(f'cd {swig_local} && ./configure --prefix={swig_local}/install-dir', env_extra=swig_env_extra)
3071 run(f'cd {swig_local} && make', env_extra=swig_env_extra) 3221 run(f'cd {swig_local} && make', env_extra=swig_env_extra)
3072 run(f'cd {swig_local} && make install', env_extra=swig_env_extra) 3222 run(f'cd {swig_local} && make install', env_extra=swig_env_extra)
3073 assert os.path.isfile(swig_binary) 3223 assert os.path.isfile(swig_binary)
3074 return swig_binary 3224 return swig_binary
3075 else: 3225 else:
3076 return swig 3226 return swig
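[Note] A hypothetical use of get_swig() from a setup.py build step. The exact syntax of the 'git:' form is handled by git_get(), which is not shown in this hunk, so only the pass-through case is illustrated:

    swig = get_swig('swig')     # Not 'git:...', so returned unchanged.
    run(f'{swig} -version')     # e.g. sanity-check the binary we will use.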
3227
3228
3229 def macos_add_brew_path(package, env=None, gnubin=True):
3230 '''
3231 Adds path(s) for Brew <package>'s binaries to env['PATH'].
3232
3233 Args:
3234 package:
3235 Name of package. We get <package_root> of installed package by
3236 running `brew --prefix <package>`.
3237 env:
3238 The environment dict to modify. If None we use os.environ. If PATH
3239 is not in <env>, we first copy os.environ['PATH'] into <env>.
3240 gnubin:
3241 If true, we also add path to gnu binaries if it exists,
3242 <package_root>/libexec/gnubin.
3243 '''
3244 if not darwin():
3245 return
3246 if env is None:
3247 env = os.environ
3248 if 'PATH' not in env:
3249 env['PATH'] = os.environ['PATH']
3250 package_root = run(f'brew --prefix {package}', capture=1).strip()
3251 def add(path):
3252 if os.path.isdir(path):
3253 log1(f'Adding to $PATH: {path}')
3254 PATH = env['PATH']
3255 env['PATH'] = f'{path}:{PATH}'
3256 add(f'{package_root}/bin')
3257 if gnubin:
3258 add(f'{package_root}/libexec/gnubin')
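[Note] Hypothetical usage, mirroring how get_swig() above puts Homebrew's bison first on PATH for subsequent commands:

    env = dict()
    macos_add_brew_path('bison', env)      # No-op on non-macOS systems.
    run('bison --version', env_extra=env)  # run() with env_extra, as elsewhere in this file.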
3077 3259
3078 3260
3079 def _show_dict(d): 3261 def _show_dict(d):
3080 ret = '' 3262 ret = ''
3081 for n in sorted(d.keys()): 3263 for n in sorted(d.keys()):
3117 ldflags_ = f'-L {ldflags_}' 3299 ldflags_ = f'-L {ldflags_}'
3118 includes_ = ' '.join(includes_) 3300 includes_ = ' '.join(includes_)
3119 return includes_, ldflags_ 3301 return includes_, ldflags_
3120 3302
3121 3303
3304 def venv_in(path=None):
3305 '''
3306 If path is None, returns true if we are in a venv. Otherwise returns true
3307 only if we are in venv <path>.
3308 '''
3309 if path:
3310 return os.path.abspath(sys.prefix) == os.path.abspath(path)
3311 else:
3312 return sys.prefix != sys.base_prefix
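[Note] Illustrative checks; the venv path is an example:

    venv_in()              # True if running inside any venv.
    venv_in('venv-build')  # True only if sys.prefix is exactly that venv.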
3313
3314
3315 def venv_run(args, path, recreate=True, clean=False):
3316 '''
3317 Runs Python command inside venv and returns termination code.
3318
3319 Args:
3320 args:
3321 List of args or string command.
3322 path:
3323 Path of venv directory.
3324 recreate:
3325 If false we do not run `<sys.executable> -m venv <path>` if <path>
3326 already exists. This avoids a delay in the common case where <path>
3327 is already set up, but fails if <path> exists but does not contain
3328 a valid venv.
3329 clean:
3330 If true we first delete <path>.
3331 '''
3332 if clean:
3333 log(f'Removing any existing venv {path}.')
3334 assert path.startswith('venv-')
3335 shutil.rmtree(path, ignore_errors=1)
3336 if recreate or not os.path.isdir(path):
3337 run(f'{sys.executable} -m venv {path}')
3338
3339 if isinstance(args, str):
3340 args_string = args
3341 elif platform.system() == 'Windows':
3342 # shlex is not reliable on Windows so we use crude quoting with "...".
3343 args_string = ''
3344 for i, arg in enumerate(args):
3345 assert '"' not in arg
3346 if i:
3347 args_string += ' '
3348 args_string += f'"{arg}"'
3349 else:
3350 args_string = shlex.join(args)
3351
3352 if platform.system() == 'Windows':
3353 command = f'{path}\\Scripts\\activate && python {args_string}'
3354 else:
3355 command = f'. {path}/bin/activate && python {args_string}'
3356 e = run(command, check=0)
3357 return e
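[Note] Hypothetical usage: run a command inside a freshly created venv and check the exit code. The venv name is an example and must start with 'venv-' when clean=True:

    e = venv_run(['-m', 'pip', '--version'], 'venv-test', clean=True)
    assert e == 0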
3358
3359
3122 if __name__ == '__main__': 3360 if __name__ == '__main__':
3123 # Internal-only limited command line support, used if 3361 # Internal-only limited command line support, used if
3124 # graal_legacy_python_config is true. 3362 # graal_legacy_python_config is true.
3125 # 3363 #
3126 includes, ldflags = sysconfig_python_flags() 3364 includes, ldflags = sysconfig_python_flags()
3127 if sys.argv[1:] == ['--graal-legacy-python-config', '--includes']: 3365 if sys.argv[1:2] == ['--doctest']:
3366 import doctest
3367 if sys.argv[2:]:
3368 for f in sys.argv[2:]:
3369 ff = globals()[f]
3370 doctest.run_docstring_examples(ff, globals())
3371 else:
3372 doctest.testmod(None)
3373 elif sys.argv[1:] == ['--graal-legacy-python-config', '--includes']:
3128 print(includes) 3374 print(includes)
3129 elif sys.argv[1:] == ['--graal-legacy-python-config', '--ldflags']: 3375 elif sys.argv[1:] == ['--graal-legacy-python-config', '--ldflags']:
3130 print(ldflags) 3376 print(ldflags)
3131 else: 3377 else:
3132 assert 0, f'Expected `--graal-legacy-python-config --includes|--ldflags` but {sys.argv=}' 3378 assert 0, f'Expected `--graal-legacy-python-config --includes|--ldflags` but {sys.argv=}'