comparison pipcl.py @ 41:71bcc18e306f

MERGE: New upstream PyMuPDF v1.26.5 including MuPDF v1.26.10. BUGS: Still needs some additional changes. Not yet tested.
author Franz Glasner <fzglas.hg@dom66.de>
date Sat, 11 Oct 2025 15:24:40 +0200
parents c4daa0c83d64 a6bc019ac0b2
children 202a1f38a622
comparison
38:8934ac156ef5 41:71bcc18e306f
1 ''' 1 '''
2 Python packaging operations, including PEP-517 support, for use by a `setup.py` 2 Python packaging operations, including PEP-517 support, for use by a `setup.py`
3 script. 3 script.
4 4
5 The intention is to take care of as many packaging details as possible so that 5 Overview:
6 setup.py contains only project-specific information, while also giving as much 6
7 flexibility as possible. 7 The intention is to take care of as many packaging details as possible so
8 8 that setup.py contains only project-specific information, while also giving
9 For example we provide a function `build_extension()` that can be used to build 9 as much flexibility as possible.
10 a SWIG extension, but we also give access to the located compiler/linker so 10
11 that a `setup.py` script can take over the details itself. 11 For example we provide a function `build_extension()` that can be used
12 12 to build a SWIG extension, but we also give access to the located
13 Run doctests with: `python -m doctest pipcl.py` 13 compiler/linker so that a `setup.py` script can take over the details
14 14 itself.
15 For Graal we require that PIPCL_GRAAL_PYTHON is set to non-graal Python (we 15
16 build for non-graal except with Graal Python's include paths and library 16 Doctests:
17 directory). 17 Doctest strings are provided in some comments.
18
19 Test in the usual way with:
20 python -m doctest pipcl.py
21
22 Test specific functions/classes with:
23 python pipcl.py --doctest run_if ...
24
25 If no functions or classes are specified, this tests everything.
26
27 Graal:
28 For Graal we require that PIPCL_GRAAL_PYTHON is set to non-graal Python (we
29 build for non-graal except with Graal Python's include paths and library
30 directory).
18 ''' 31 '''
19 32
20 import base64 33 import base64
21 import codecs 34 import codecs
35 import difflib
22 import glob 36 import glob
23 import hashlib 37 import hashlib
24 import inspect 38 import inspect
25 import io 39 import io
26 import os 40 import os
52 66
53 We also support basic command line handling for use 67 We also support basic command line handling for use
54 with a legacy (pre-PEP-517) pip, as implemented 68 with a legacy (pre-PEP-517) pip, as implemented
55 by legacy distutils/setuptools and described in: 69 by legacy distutils/setuptools and described in:
56 https://pip.pypa.io/en/stable/reference/build-system/setup-py/ 70 https://pip.pypa.io/en/stable/reference/build-system/setup-py/
71
72 The file pyproject.toml must exist; this is checked if/when fn_build() is
73 called.
57 74
58 Here is a `doctest` example of using pipcl to create a SWIG extension 75 Here is a `doctest` example of using pipcl to create a SWIG extension
59 module. Requires `swig`. 76 module. Requires `swig`.
60 77
61 Create an empty test directory: 78 Create an empty test directory:
319 336
320 wheel_compression = zipfile.ZIP_DEFLATED, 337 wheel_compression = zipfile.ZIP_DEFLATED,
321 wheel_compresslevel = None, 338 wheel_compresslevel = None,
322 ): 339 ):
323 ''' 340 '''
324 The initial args before `root` define the package 341 The initial args before `entry_points` define the
325 metadata and closely follow the definitions in: 342 package metadata and closely follow the definitions in:
326 https://packaging.python.org/specifications/core-metadata/ 343 https://packaging.python.org/specifications/core-metadata/
327 344
328 Args: 345 Args:
329 346
330 name: 347 name:
348 Used for metadata `Name`.
331 A string, the name of the Python package. 349 A string, the name of the Python package.
332 version: 350 version:
351 Used for metadata `Version`.
333 A string, the version of the Python package. Also see PEP-440 352 A string, the version of the Python package. Also see PEP-440
334 `Version Identification and Dependency Specification`. 353 `Version Identification and Dependency Specification`.
335 platform: 354 platform:
355 Used for metadata `Platform`.
336 A string or list of strings. 356 A string or list of strings.
337 supported_platform: 357 supported_platform:
358 Used for metadata `Supported-Platform`.
338 A string or list of strings. 359 A string or list of strings.
339 summary: 360 summary:
361 Used for metadata `Summary`.
340 A string, short description of the package. 362 A string, short description of the package.
341 description: 363 description:
364 Used for metadata `Description`.
342 A string. If contains newlines, a detailed description of the 365 A string. If contains newlines, a detailed description of the
343 package. Otherwise the path of a file containing the detailed 366 package. Otherwise the path of a file containing the detailed
344 description of the package. 367 description of the package.
345 description_content_type: 368 description_content_type:
369 Used for metadata `Description-Content-Type`.
346 A string describing markup of `description` arg. For example 370 A string describing markup of `description` arg. For example
347 `text/markdown; variant=GFM`. 371 `text/markdown; variant=GFM`.
348 keywords: 372 keywords:
373 Used for metadata `Keywords`.
349 A string containing comma-separated keywords. 374 A string containing comma-separated keywords.
350 home_page: 375 home_page:
376 Used for metadata `Home-page`.
351 URL of home page. 377 URL of home page.
352 download_url: 378 download_url:
379 Used for metadata `Download-URL`.
353 Where this version can be downloaded from. 380 Where this version can be downloaded from.
354 author: 381 author:
382 Used for metadata `Author`.
355 Author. 383 Author.
356 author_email: 384 author_email:
385 Used for metadata `Author-email`.
357 Author email. 386 Author email.
358 maintainer: 387 maintainer:
388 Used for metadata `Maintainer`.
359 Maintainer. 389 Maintainer.
360 maintainer_email: 390 maintainer_email:
391 Used for metadata `Maintainer-email`.
361 Maintainer email. 392 Maintainer email.
362 license: 393 license:
394 Used for metadata `License`.
363 A string containing the license text. Written into metadata 395 A string containing the license text. Written into metadata
364 file `COPYING`. Is also written into metadata itself if not 396 file `COPYING`. Is also written into metadata itself if not
365 multi-line. 397 multi-line.
366 classifier: 398 classifier:
399 Used for metadata `Classifier`.
367 A string or list of strings. Also see: 400 A string or list of strings. Also see:
368 401
369 * https://pypi.org/pypi?%3Aaction=list_classifiers 402 * https://pypi.org/pypi?%3Aaction=list_classifiers
370 * https://pypi.org/classifiers/ 403 * https://pypi.org/classifiers/
371 404
372 requires_dist: 405 requires_dist:
373 A string or list of strings. None items are ignored. Also see PEP-508. 406 Used for metadata `Requires-Dist`.
407 A string or list of strings, Python packages required
408 at runtime. None items are ignored.
374 requires_python: 409 requires_python:
410 Used for metadata `Requires-Python`.
375 A string or list of strings. 411 A string or list of strings.
376 requires_external: 412 requires_external:
413 Used for metadata `Requires-External`.
377 A string or list of strings. 414 A string or list of strings.
378 project_url: 415 project_url:
379 A string or list of strings, each of the form: `{name}, {url}`. 416 Used for metadata `Project-URL`.
417 A string or list of strings, each of the form: `{name},
418 {url}`.
380 provides_extra: 419 provides_extra:
420 Used for metadata `Provides-Extra`.
381 A string or list of strings. 421 A string or list of strings.
382 422
383 entry_points: 423 entry_points:
384 String or dict specifying *.dist-info/entry_points.txt, for 424 String or dict specifying *.dist-info/entry_points.txt, for
385 example: 425 example:
413 be the path to a file; a relative path is treated as relative 453 be the path to a file; a relative path is treated as relative
414 to `root`. If a `bytes` it is the contents of the file to be 454 to `root`. If a `bytes` it is the contents of the file to be
415 added. 455 added.
416 456
417 `to_` identifies what the file should be called within a wheel 457 `to_` identifies what the file should be called within a wheel
418 or when installing. If `to_` ends with `/`, the leaf of `from_` 458 or when installing. If `to_` is empty or `/` we set it to the
419 is appended to it (and `from_` must not be a `bytes`). 459 leaf of `from_` (`from_` must not be a `bytes`) - i.e. we place
460 the file in the root directory of the wheel; otherwise if
461 `to_` ends with `/` the leaf of `from_` is appended to it (and
462 `from_` must not be a `bytes`).
420 463
421 Initial `$dist-info/` in `to_` is replaced by 464
422 `{name}-{version}.dist-info/`; this is useful for license files 465 `{name}-{version}.dist-info/`; this is useful for license files
423 etc. 466 etc.
424 467
437 we copy `from_` to `{sitepackages}/{to_}`, where 480 we copy `from_` to `{sitepackages}/{to_}`, where
438 `sitepackages` is the installation directory, the 481 `sitepackages` is the installation directory, the
439 default being `sysconfig.get_path('platlib')` e.g. 482 default being `sysconfig.get_path('platlib')` e.g.
440 `myvenv/lib/python3.9/site-packages/`. 483 `myvenv/lib/python3.9/site-packages/`.
441 484
485 When calling this function, we assert that the file
486 pyproject.toml exists in the current directory. (We do this
487 here rather than in pipcl.Package's constructor, as otherwise
488 importing setup.py from non-package-related code could fail.)
489
442 fn_clean: 490 fn_clean:
443 A function taking a single arg `all_` that cleans generated 491 A function taking a single arg `all_` that cleans generated
444 files. `all_` is true iff `--all` is in argv. 492 files. `all_` is true iff `--all` is in argv.
445 493
446 For safety and convenience, can also return a list of 494
455 as returned by `fn_build`. 503 as returned by `fn_build`.
456 504
457 It can be convenient to use `pipcl.git_items()`. 505 It can be convenient to use `pipcl.git_items()`.
458 506
459 The specification for sdists requires that the list contains 507 The specification for sdists requires that the list contains
460 `pyproject.toml`; we enforce this with a diagnostic rather than 508 `pyproject.toml`; we enforce this with a Python assert.
461 raising an exception, to allow legacy command-line usage.
462 509
463 tag_python: 510 tag_python:
464 First element of wheel tag defined in PEP-425. If None we use 511 First element of wheel tag defined in PEP-425. If None we use
465 `cp{version}`. 512 `cp{version}`.
466 513
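To make the constructor arguments and callbacks described above concrete, here is a rough sketch of how a setup.py might drive pipcl; every name, path and dependency below is invented for illustration only and is not taken from this changeset:

    import pipcl

    def build(config_settings=None):
        # Return (from_, to_) items as described above. A trailing '/' in to_
        # keeps the leafname of from_, and '$dist-info/' expands to
        # '{name}-{version}.dist-info/'.
        return [
                ('build/_foo.so', 'foo/'),
                ('src/__init__.py', 'foo/__init__.py'),
                ('COPYING', '$dist-info/'),
                ]

    def sdist(config_settings=None):
        # pyproject.toml must be among the returned paths; pipcl.git_items()
        # includes it if it is tracked by git.
        return pipcl.git_items('.')

    def clean(all_):
        # Returning paths lets pipcl delete them itself.
        return ['build']

    p = pipcl.Package(
            name = 'foo',
            version = '1.2.3',
            summary = 'Example package.',
            requires_dist = 'requests',
            fn_build = build,
            fn_sdist = sdist,
            fn_clean = clean,
            )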
526 assert_str_or_multi( requires_dist) 573 assert_str_or_multi( requires_dist)
527 assert_str( requires_python) 574 assert_str( requires_python)
528 assert_str_or_multi( requires_external) 575 assert_str_or_multi( requires_external)
529 assert_str_or_multi( project_url) 576 assert_str_or_multi( project_url)
530 assert_str_or_multi( provides_extra) 577 assert_str_or_multi( provides_extra)
578
579 assert re.match('^([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])\\Z', name, re.IGNORECASE), (
580 f'Invalid package name'
581 f' (https://packaging.python.org/en/latest/specifications/name-normalization/)'
582 f': {name!r}'
583 )
531 584
532 # https://packaging.python.org/en/latest/specifications/core-metadata/. 585 # https://packaging.python.org/en/latest/specifications/core-metadata/.
533 assert re.match('([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])$', name, re.IGNORECASE), \ 586 assert re.match('([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])$', name, re.IGNORECASE), \
534 f'Bad name: {name!r}' 587 f'Bad name: {name!r}'
535 588
600 f' wheel_directory={wheel_directory!r}' 653 f' wheel_directory={wheel_directory!r}'
601 f' config_settings={config_settings!r}' 654 f' config_settings={config_settings!r}'
602 f' metadata_directory={metadata_directory!r}' 655 f' metadata_directory={metadata_directory!r}'
603 ) 656 )
604 657
605 if sys.implementation.name == 'graalpy': 658 if os.environ.get('CIBUILDWHEEL') == '1':
659 # Don't special-case graal builds when running under cibuildwheel.
660 pass
661 elif sys.implementation.name == 'graalpy':
606 # We build for Graal by building a native Python wheel with Graal 662 # We build for Graal by building a native Python wheel with Graal
607 # Python's include paths and library directory. We then rename the 663 # Python's include paths and library directory. We then rename the
608 # wheel to contain graal's tag etc. 664 # wheel to contain graal's tag etc.
609 # 665 #
610 log0(f'### Graal build: deferring to cpython.') 666 log0(f'### Graal build: deferring to cpython.')
752 if inspect.signature(self.fn_sdist).parameters: 808 if inspect.signature(self.fn_sdist).parameters:
753 items = self.fn_sdist(config_settings) 809 items = self.fn_sdist(config_settings)
754 else: 810 else:
755 items = self.fn_sdist() 811 items = self.fn_sdist()
756 812
757 prefix = f'{_normalise(self.name)}-{self.version}' 813 prefix = f'{_normalise2(self.name)}-{self.version}'
758 os.makedirs(sdist_directory, exist_ok=True) 814 os.makedirs(sdist_directory, exist_ok=True)
759 tarpath = f'{sdist_directory}/{prefix}.tar.gz' 815 tarpath = f'{sdist_directory}/{prefix}.tar.gz'
760 log2(f'Creating sdist: {tarpath}') 816 log2(f'Creating sdist: {tarpath}')
761 817
762 with tarfile.open(tarpath, 'w:gz') as tar: 818 with tarfile.open(tarpath, 'w:gz') as tar:
808 if from_.startswith(f'{os.path.abspath(sdist_directory)}/'): 864 if from_.startswith(f'{os.path.abspath(sdist_directory)}/'):
809 # Source files should not be inside <sdist_directory>. 865 # Source files should not be inside <sdist_directory>.
810 assert 0, f'Path is inside sdist_directory={sdist_directory}: {from_!r}' 866 assert 0, f'Path is inside sdist_directory={sdist_directory}: {from_!r}'
811 assert os.path.exists(from_), f'Path does not exist: {from_!r}' 867 assert os.path.exists(from_), f'Path does not exist: {from_!r}'
812 assert os.path.isfile(from_), f'Path is not a file: {from_!r}' 868 assert os.path.isfile(from_), f'Path is not a file: {from_!r}'
813 if to_rel == 'pyproject.toml':
814 found_pyproject_toml = True
815 add(from_, to_rel) 869 add(from_, to_rel)
816 870 if to_rel == 'pyproject.toml':
817 if not found_pyproject_toml: 871 found_pyproject_toml = True
818 log0(f'Warning: no pyproject.toml specified.') 872
873 assert found_pyproject_toml, f'Cannot create sdist because file not specified: pyproject.toml'
819 874
820 # Always add a PKG-INFO file. 875 # Always add a PKG-INFO file.
821 add_string(self._metainfo(), 'PKG-INFO') 876 add_string(self._metainfo(), 'PKG-INFO')
822 877
823 if self.license: 878 if self.license:
838 def tag_python(self): 893 def tag_python(self):
839 ''' 894 '''
840 Get two-digit python version, e.g. 'cp38' for python-3.8.6. 895
841 ''' 896 '''
842 if self.tag_python_: 897 if self.tag_python_:
843 return self.tag_python_ 898 ret = self.tag_python_
844 else: 899 else:
845 return 'cp' + ''.join(platform.python_version().split('.')[:2]) 900 ret = 'cp' + ''.join(platform.python_version().split('.')[:2])
901 assert '-' not in ret
902 return ret
846 903
847 def tag_abi(self): 904 def tag_abi(self):
848 ''' 905 '''
849 ABI tag. 906 ABI tag.
850 ''' 907 '''
896 ret2 = f'{m.group(1)}_0{m.group(2)}' 953 ret2 = f'{m.group(1)}_0{m.group(2)}'
897 log0(f'After macos patch, changing from {ret!r} to {ret2!r}.') 954 log0(f'After macos patch, changing from {ret!r} to {ret2!r}.')
898 ret = ret2 955 ret = ret2
899 956
900 log0( f'tag_platform(): returning {ret=}.') 957 log0( f'tag_platform(): returning {ret=}.')
958 assert '-' not in ret
901 return ret 959 return ret
902 960
903 def wheel_name(self): 961 def wheel_name(self):
904 return f'{_normalise(self.name)}-{self.version}-{self.tag_python()}-{self.tag_abi()}-{self.tag_platform()}.whl' 962 ret = f'{_normalise2(self.name)}-{self.version}-{self.tag_python()}-{self.tag_abi()}-{self.tag_platform()}.whl'
963 assert ret.count('-') == 4, f'Expected 4 dash characters in {ret=}.'
964 return ret
905 965
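For orientation, the five dash-separated fields assembled by wheel_name() give names of roughly this shape (values illustrative only, not from the changeset):

    # {name}-{version}-{tag_python}-{tag_abi}-{tag_platform}.whl, for example
    # on CPython 3.11 / Linux x86_64 something like:
    #     foo-1.2.3-cp311-cp311-linux_x86_64.whl
    # or, for a py_limited_api build, an abi3-style name such as:
    #     foo-1.2.3-cp311-abi3-linux_x86_64.whl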
906 def wheel_name_match(self, wheel): 966 def wheel_name_match(self, wheel):
907 ''' 967 '''
908 Returns true if `wheel` matches our wheel. We basically require the 968 Returns true if `wheel` matches our wheel. We basically require the
909 name to be the same, except that we accept platform tags that contain 969 name to be the same, except that we accept platform tags that contain
928 # This wheel uses Python stable ABI same or older than ours, so 988 # This wheel uses Python stable ABI same or older than ours, so
929 # we can use it. 989 # we can use it.
930 log2(f'py_limited_api; {tag_python=} compatible with {self.tag_python()=}.') 990 log2(f'py_limited_api; {tag_python=} compatible with {self.tag_python()=}.')
931 py_limited_api_compatible = True 991 py_limited_api_compatible = True
932 992
933 log2(f'{_normalise(self.name) == name=}') 993 log2(f'{_normalise2(self.name) == name=}')
934 log2(f'{self.version == version=}') 994 log2(f'{self.version == version=}')
935 log2(f'{self.tag_python() == tag_python=} {self.tag_python()=} {tag_python=}') 995 log2(f'{self.tag_python() == tag_python=} {self.tag_python()=} {tag_python=}')
936 log2(f'{py_limited_api_compatible=}') 996 log2(f'{py_limited_api_compatible=}')
937 log2(f'{self.tag_abi() == tag_abi=}') 997 log2(f'{self.tag_abi() == tag_abi=}')
938 log2(f'{self.tag_platform() in tag_platform.split(".")=}') 998 log2(f'{self.tag_platform() in tag_platform.split(".")=}')
939 log2(f'{self.tag_platform()=}') 999 log2(f'{self.tag_platform()=}')
940 log2(f'{tag_platform.split(".")=}') 1000 log2(f'{tag_platform.split(".")=}')
941 ret = (1 1001 ret = (1
942 and _normalise(self.name) == name 1002 and _normalise2(self.name) == name
943 and self.version == version 1003 and self.version == version
944 and (self.tag_python() == tag_python or py_limited_api_compatible) 1004 and (self.tag_python() == tag_python or py_limited_api_compatible)
945 and self.tag_abi() == tag_abi 1005 and self.tag_abi() == tag_abi
946 and self.tag_platform() in tag_platform.split('.') 1006 and self.tag_platform() in tag_platform.split('.')
947 ) 1007 )
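As background for the `tag_platform.split('.')` test above (an illustrative note, not text from the changeset): PEP 425 allows a wheel to carry a compressed set of platform tags joined by '.', so the match checks membership in that set:

    # e.g. a manylinux wheel can end with a compressed tag set:
    #     'manylinux_2_17_x86_64.manylinux2014_x86_64'.split('.')
    #     == ['manylinux_2_17_x86_64', 'manylinux2014_x86_64']
    # and the wheel is accepted if self.tag_platform() is one of these.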
959 ret += f'{value}\n' 1019 ret += f'{value}\n'
960 return ret 1020 return ret
961 1021
962 def _call_fn_build( self, config_settings=None): 1022 def _call_fn_build( self, config_settings=None):
963 assert self.fn_build 1023 assert self.fn_build
1024 assert os.path.isfile('pyproject.toml'), (
1025 'Cannot create package because file does not exist: pyproject.toml'
1026 )
964 log2(f'calling self.fn_build={self.fn_build}') 1027 log2(f'calling self.fn_build={self.fn_build}')
965 if inspect.signature(self.fn_build).parameters: 1028 if inspect.signature(self.fn_build).parameters:
966 ret = self.fn_build(config_settings) 1029 ret = self.fn_build(config_settings)
967 else: 1030 else:
968 ret = self.fn_build() 1031 ret = self.fn_build()
969 assert isinstance( ret, (list, tuple)), \ 1032 assert isinstance( ret, (list, tuple)), \
970 f'Expected list/tuple from {self.fn_build} but got: {ret!r}' 1033 f'Expected list/tuple from {self.fn_build} but got: {ret!r}'
1034
1035 # Check that any extensions that we have built have the same
1036 # py_limited_api value. If package is marked with py_limited_api=True
1037 # then non-py_limited_api extensions seem to fail at runtime on
1038 # Windows.
1039 #
1040 # (We could possibly allow package py_limited_api=False and extensions
1041 # py_limited_api=True, but haven't tested this, and it seems simpler to
1042 # be strict.)
1043 for item in ret:
1044 from_, (to_abs, to_rel) = self._fromto(item)
1045 from_abs = os.path.abspath(from_)
1046 is_py_limited_api = _extensions_to_py_limited_api.get(from_abs)
1047 if is_py_limited_api is not None:
1048 assert bool(self.py_limited_api) == bool(is_py_limited_api), (
1049 f'Extension was built with'
1050 f' py_limited_api={is_py_limited_api} but pipcl.Package'
1051 f' name={self.name!r} has'
1052 f' py_limited_api={self.py_limited_api}:'
1053 f' {from_abs!r}'
1054 )
1055
971 return ret 1056 return ret
972 1057
973 1058
974 def _argv_clean(self, all_): 1059 def _argv_clean(self, all_):
975 ''' 1060 '''
1064 Called by `handle_argv()`. There doesn't seem to be any documentation 1149 Called by `handle_argv()`. There doesn't seem to be any documentation
1065 for `setup.py dist_info`, but it appears to be like `egg_info` except 1150 for `setup.py dist_info`, but it appears to be like `egg_info` except
1066 it writes to a slightly different directory. 1151 it writes to a slightly different directory.
1067 ''' 1152 '''
1068 if root is None: 1153 if root is None:
1069 root = f'{self.name}-{self.version}.dist-info' 1154 root = f'{_normalise2(self.name)}-{self.version}.dist-info'
1070 self._write_info(f'{root}/METADATA') 1155 self._write_info(f'{root}/METADATA')
1071 if self.license: 1156 if self.license:
1072 with open( f'{root}/COPYING', 'w') as f: 1157 with open( f'{root}/COPYING', 'w') as f:
1073 f.write( self.license) 1158 f.write( self.license)
1074 1159
1352 f' tag_platform={self.tag_platform_!r}' 1437 f' tag_platform={self.tag_platform_!r}'
1353 '}' 1438 '}'
1354 ) 1439 )
1355 1440
1356 def _dist_info_dir( self): 1441 def _dist_info_dir( self):
1357 return f'{_normalise(self.name)}-{self.version}.dist-info' 1442 return f'{_normalise2(self.name)}-{self.version}.dist-info'
1358 1443
1359 def _metainfo(self): 1444 def _metainfo(self):
1360 ''' 1445 '''
1361 Returns text for `.egg-info/PKG-INFO` file, or `PKG-INFO` in an sdist 1446 Returns text for `.egg-info/PKG-INFO` file, or `PKG-INFO` in an sdist
1362 `.tar.gz` file, or `...dist-info/METADATA` in a wheel. 1447 `.tar.gz` file, or `...dist-info/METADATA` in a wheel.
1465 1550
1466 If `p` is a string we convert to `(p, p)`. Otherwise we assert that 1551 If `p` is a string we convert to `(p, p)`. Otherwise we assert that
1467 `p` is a tuple `(from_, to_)` where `from_` is str/bytes and `to_` is 1552 `p` is a tuple `(from_, to_)` where `from_` is str/bytes and `to_` is
1468 str. If `from_` is a bytes it is contents of file to add, otherwise the 1553 str. If `from_` is a bytes it is contents of file to add, otherwise the
1469 path of an existing file; non-absolute paths are assumed to be relative 1554 path of an existing file; non-absolute paths are assumed to be relative
1470 to `self.root`. If `to_` is empty or ends with `/`, we append the leaf 1555 to `self.root`.
1471 of `from_` (which must be a str). 1556
1557 If `to_` is empty or `/` we set it to the leaf of `from_` (which must
1558 be a str) - i.e. we place the file in the root directory of the wheel;
1559 otherwise if `to_` ends with `/` we append the leaf of `from_` (which
1560 must be a str).
1472 1561
1473 If `to_` starts with `$dist-info/`, we replace this with 1562 If `to_` starts with `$dist-info/`, we replace this with
1474 `self._dist_info_dir()`. 1563 `self._dist_info_dir()`.
1475 1564
1476 If `to_` starts with `$data/`, we replace this with 1565 If `to_` starts with `$data/`, we replace this with
1486 assert isinstance(p, tuple) and len(p) == 2 1575 assert isinstance(p, tuple) and len(p) == 2
1487 1576
1488 from_, to_ = p 1577 from_, to_ = p
1489 assert isinstance(from_, (str, bytes)) 1578 assert isinstance(from_, (str, bytes))
1490 assert isinstance(to_, str) 1579 assert isinstance(to_, str)
1491 if to_.endswith('/') or to_=='': 1580 if to_ == '/' or to_ == '':
1581 to_ = os.path.basename(from_)
1582 elif to_.endswith('/'):
1492 to_ += os.path.basename(from_) 1583 to_ += os.path.basename(from_)
1493 prefix = '$dist-info/' 1584 prefix = '$dist-info/'
1494 if to_.startswith( prefix): 1585 if to_.startswith( prefix):
1495 to_ = f'{self._dist_info_dir()}/{to_[ len(prefix):]}' 1586 to_ = f'{self._dist_info_dir()}/{to_[ len(prefix):]}'
1496 prefix = '$data/' 1587 prefix = '$data/'
1497 if to_.startswith( prefix): 1588 if to_.startswith( prefix):
1498 to_ = f'{self.name}-{self.version}.data/{to_[ len(prefix):]}' 1589 to_ = f'{_normalise2(self.name)}-{self.version}.data/{to_[ len(prefix):]}'
1499 if isinstance(from_, str): 1590 if isinstance(from_, str):
1500 from_, _, _ = self._path_relative_to_root( from_, assert_within_root=False, resolve_symlinks=resolve_symlinks) 1591 from_, _, _ = self._path_relative_to_root( from_, assert_within_root=False, resolve_symlinks=resolve_symlinks)
1501 to_ = self._path_relative_to_root(to_, resolve_symlinks=resolve_symlinks) 1592 to_ = self._path_relative_to_root(to_, resolve_symlinks=resolve_symlinks)
1502 assert isinstance(from_, (str, bytes)) 1593 assert isinstance(from_, (str, bytes))
1503 log2(f'returning {from_=} {to_=}') 1594 log2(f'returning {from_=} {to_=}')
1504 return from_, to_ 1595 return from_, to_
1505 1596
1597 _extensions_to_py_limited_api = dict()
1506 1598
1507 def build_extension( 1599 def build_extension(
1508 name, 1600 name,
1509 path_i, 1601 path_i,
1510 outdir, 1602 outdir,
1603 *,
1511 builddir=None, 1604 builddir=None,
1512 includes=None, 1605 includes=None,
1513 defines=None, 1606 defines=None,
1514 libpaths=None, 1607 libpaths=None,
1515 libs=None, 1608 libs=None,
1517 debug=False, 1610 debug=False,
1518 compiler_extra='', 1611 compiler_extra='',
1519 linker_extra='', 1612 linker_extra='',
1520 swig=None, 1613 swig=None,
1521 cpp=True, 1614 cpp=True,
1615 source_extra=None,
1522 prerequisites_swig=None, 1616 prerequisites_swig=None,
1523 prerequisites_compile=None, 1617 prerequisites_compile=None,
1524 prerequisites_link=None, 1618 prerequisites_link=None,
1525 infer_swig_includes=True, 1619 infer_swig_includes=True,
1526 py_limited_api=False, 1620 py_limited_api=False,
1558 `/LIBPATH:` on Windows or `-L` on Unix. 1652 `/LIBPATH:` on Windows or `-L` on Unix.
1559 libs 1653 libs
1560 A string, or a sequence of library names. Each item is prefixed 1654 A string, or a sequence of library names. Each item is prefixed
1561 with `-l` on non-Windows. 1655 with `-l` on non-Windows.
1562 optimise: 1656 optimise:
1563 Whether to use compiler optimisations. 1657 Whether to use compiler optimisations and define NDEBUG.
1564 debug: 1658 debug:
1565 Whether to build with debug symbols. 1659 Whether to build with debug symbols.
1566 compiler_extra: 1660 compiler_extra:
1567 Extra compiler flags. Can be None. 1661 Extra compiler flags. Can be None.
1568 linker_extra: 1662 linker_extra:
1569 Extra linker flags. Can be None. 1663 Extra linker flags. Can be None.
1570 swig: 1664 swig:
1571 Swig command; if false we use 'swig'. 1665 Swig command; if false we use 'swig'.
1572 cpp: 1666 cpp:
1573 If true we tell SWIG to generate C++ code instead of C. 1667 If true we tell SWIG to generate C++ code instead of C.
1668 source_extra:
1669 Extra source files to build into the shared library.
1574 prerequisites_swig: 1670 prerequisites_swig:
1575 prerequisites_compile: 1671 prerequisites_compile:
1576 prerequisites_link: 1672 prerequisites_link:
1577 1673
1578 [These are mainly for use on Windows. On other systems we 1674 [These are mainly for use on Windows. On other systems we
1603 infer_swig_includes: 1699 infer_swig_includes:
1604 If true, we extract `-I<path>` and `-I <path>` args from 1700 If true, we extract `-I<path>` and `-I <path>` args from
1605 `compile_extra` (also `/I` on windows) and use them with swig so 1701 `compile_extra` (also `/I` on windows) and use them with swig so
1606 that it can see the same header files as C/C++. This is useful 1702 that it can see the same header files as C/C++. This is useful
1607 when using environment variables such as `CC` and `CXX` to set 1703
1608 `compile_extra. 1704 `compile_extra`.
1609 py_limited_api: 1705 py_limited_api:
1610 If true we build for current Python's limited API / stable ABI. 1706 If true we build for current Python's limited API / stable ABI.
1707
1708 Note that we will assert false if this extension is added to a
1709 pipcl.Package that has a different <py_limited_api>, because
1710 on Windows importing a non-py_limited_api extension inside a
1711 py_limited_api=True package fails.
1611 1712
1612 Returns the leafname of the generated library file within `outdir`, e.g. 1713 Returns the leafname of the generated library file within `outdir`, e.g.
1613 `_{name}.so` on Unix or `_{name}.cp311-win_amd64.pyd` on Windows. 1714 `_{name}.so` on Unix or `_{name}.cp311-win_amd64.pyd` on Windows.
1614 ''' 1715 '''
1615 if compiler_extra is None: 1716 if compiler_extra is None:
1618 linker_extra = '' 1719 linker_extra = ''
1619 if builddir is None: 1720 if builddir is None:
1620 builddir = outdir 1721 builddir = outdir
1621 if not swig: 1722 if not swig:
1622 swig = 'swig' 1723 swig = 'swig'
1724
1725 if source_extra is None:
1726 source_extra = list()
1727 if isinstance(source_extra, str):
1728 source_extra = [source_extra]
1729
1623 includes_text = _flags( includes, '-I') 1730 includes_text = _flags( includes, '-I')
1624 defines_text = _flags( defines, '-D') 1731 defines_text = _flags( defines, '-D')
1625 libpaths_text = _flags( libpaths, '/LIBPATH:', '"') if windows() else _flags( libpaths, '-L') 1732 libpaths_text = _flags( libpaths, '/LIBPATH:', '"') if windows() else _flags( libpaths, '-L')
1626 libs_text = _flags( libs, '' if windows() else '-l') 1733 libs_text = _flags( libs, '' if windows() else '-l')
1627 path_cpp = f'{builddir}/{os.path.basename(path_i)}' 1734 path_cpp = f'{builddir}/{os.path.basename(path_i)}'
1628 path_cpp += '.cpp' if cpp else '.c' 1735 path_cpp += '.cpp' if cpp else '.c'
1629 os.makedirs( outdir, exist_ok=True) 1736 os.makedirs( outdir, exist_ok=True)
1630 1737
1631 # Run SWIG. 1738 # Run SWIG.
1632 1739 #
1633 if infer_swig_includes: 1740 if infer_swig_includes:
1634 # Extract include flags from `compiler_extra`. 1741 # Extract include flags from `compiler_extra`.
1635 swig_includes_extra = '' 1742 swig_includes_extra = ''
1636 compiler_extra_items = compiler_extra.split() 1743 compiler_extra_items = shlex.split(compiler_extra)
1637 i = 0 1744 i = 0
1638 while i < len(compiler_extra_items): 1745 while i < len(compiler_extra_items):
1639 item = compiler_extra_items[i] 1746 item = compiler_extra_items[i]
1640 # Swig doesn't seem to like a space after `I`. 1747 # Swig doesn't seem to like a space after `I`.
1641 if item == '-I' or (windows() and item == '/I'): 1748 if item == '-I' or (windows() and item == '/I'):
1666 path_i, 1773 path_i,
1667 prerequisites_swig, 1774 prerequisites_swig,
1668 prerequisites_swig2, 1775 prerequisites_swig2,
1669 ) 1776 )
1670 1777
1671 so_suffix = _so_suffix(use_so_versioning = not py_limited_api) 1778 if pyodide():
1779 so_suffix = '.so'
1780 log0(f'pyodide: PEP-3149 suffix untested, so omitting. {_so_suffix()=}.')
1781 else:
1782 so_suffix = _so_suffix(use_so_versioning = not py_limited_api)
1672 path_so_leaf = f'_{name}{so_suffix}' 1783 path_so_leaf = f'_{name}{so_suffix}'
1673 path_so = f'{outdir}/{path_so_leaf}' 1784 path_so = f'{outdir}/{path_so_leaf}'
1674 1785
1675 py_limited_api2 = current_py_limited_api() if py_limited_api else None 1786 py_limited_api2 = current_py_limited_api() if py_limited_api else None
1676 1787
1788 compiler_command, pythonflags = base_compiler(cpp=cpp)
1789 linker_command, _ = base_linker(cpp=cpp)
1790 # setuptools on Linux seems to use slightly different compile flags:
1791 #
1792 # -fwrapv -O3 -Wall -O2 -g0 -DPY_CALL_TRAMPOLINE
1793 #
1794
1795 general_flags = ''
1677 if windows(): 1796 if windows():
1678 path_obj = f'{path_so}.obj'
1679
1680 permissive = '/permissive-' 1797 permissive = '/permissive-'
1681 EHsc = '/EHsc' 1798 EHsc = '/EHsc'
1682 T = '/Tp' if cpp else '/Tc' 1799 T = '/Tp' if cpp else '/Tc'
1683 optimise2 = '/DNDEBUG /O2' if optimise else '/D_DEBUG' 1800 optimise2 = '/DNDEBUG /O2' if optimise else '/D_DEBUG'
1684 debug2 = '' 1801 debug2 = '/Zi' if debug else ''
1802 py_limited_api3 = f'/DPy_LIMITED_API={py_limited_api2}' if py_limited_api2 else ''
1803
1804 else:
1685 if debug: 1805 if debug:
1686 debug2 = '/Zi' # Generate .pdb. 1806 general_flags += '/Zi' if windows() else ' -g'
1687 # debug2 = '/Z7' # Embed debug info in .obj files. 1807 if optimise:
1688 1808 general_flags += ' /DNDEBUG /O2' if windows() else ' -O2 -DNDEBUG'
1689 py_limited_api3 = f'/DPy_LIMITED_API={py_limited_api2}' if py_limited_api2 else '' 1809
1690 1810 py_limited_api3 = f'-DPy_LIMITED_API={py_limited_api2}' if py_limited_api2 else ''
1691 # As of 2023-08-23, it looks like VS tools create slightly 1811
1692 # .dll's each time, even with identical inputs. 1812 if windows():
1813 pass
1814 elif darwin():
1815 # MacOS's linker does not like `-z origin`.
1816 rpath_flag = "-Wl,-rpath,@loader_path/"
1817 # Avoid `Undefined symbols for ... "_PyArg_UnpackTuple" ...'.
1818 general_flags += ' -undefined dynamic_lookup'
1819 elif pyodide():
1820 # Setting `-Wl,-rpath,'$ORIGIN',-z,origin` gives:
1821 # emcc: warning: ignoring unsupported linker flag: `-rpath` [-Wlinkflags]
1822 # wasm-ld: error: unknown -z value: origin
1693 # 1823 #
1694 # Some info about this is at: 1824 rpath_flag = "-Wl,-rpath,'$ORIGIN'"
1695 # https://nikhilism.com/post/2020/windows-deterministic-builds/. 1825 else:
1696 # E.g. an undocumented linker flag `/Brepro`. 1826 rpath_flag = "-Wl,-rpath,'$ORIGIN',-z,origin"
1697 # 1827
1698 1828 # Fun fact - on Linux, if the -L and -l options are before '{path_cpp}'
1699 command, pythonflags = base_compiler(cpp=cpp) 1829 # they seem to be ignored...
1700 command = f''' 1830 #
1701 {command} 1831 path_os = list()
1702 # General: 1832
1703 /c # Compiles without linking. 1833 for path_source in [path_cpp] + source_extra:
1704 {EHsc} # Enable "Standard C++ exception handling". 1834 path_o = f'{path_source}.obj' if windows() else f'{path_source}.o'
1705 1835 path_os.append(f' {path_o}')
1706 #/MD # Creates a multithreaded DLL using MSVCRT.lib. 1836
1707 {'/MDd' if debug else '/MD'} 1837 prerequisites_path = f'{path_o}.d'
1708 1838
1709 # Input/output files: 1839 if windows():
1710 {T}{path_cpp} # /Tp specifies C++ source file. 1840 compiler_command2 = f'''
1711 /Fo{path_obj} # Output file. codespell:ignore 1841 {compiler_command}
1712 1842 # General:
1713 # Include paths: 1843 /c # Compiles without linking.
1714 {includes_text} 1844 {EHsc} # Enable "Standard C++ exception handling".
1715 {pythonflags.includes} # Include path for Python headers. 1845
1716 1846 #/MD # Creates a multithreaded DLL using MSVCRT.lib.
1717 # Code generation: 1847 {'/MDd' if debug else '/MD'}
1718 {optimise2} 1848
1719 {debug2} 1849 # Input/output files:
1720 {permissive} # Set standard-conformance mode. 1850 {T}{path_source} # /Tp specifies C++ source file.
1721 1851 /Fo{path_o} # Output file. codespell:ignore
1722 # Diagnostics: 1852
1723 #/FC # Display full path of source code files passed to cl.exe in diagnostic text. 1853 # Include paths:
1724 /W3 # Sets which warning level to output. /W3 is IDE default. 1854 {includes_text}
1725 /diagnostics:caret # Controls the format of diagnostic messages. 1855 {pythonflags.includes} # Include path for Python headers.
1726 /nologo # 1856
1727 1857 # Code generation:
1728 {defines_text} 1858 {optimise2}
1729 {compiler_extra} 1859 {debug2}
1730 1860 {permissive} # Set standard-conformance mode.
1731 {py_limited_api3} 1861
1732 ''' 1862 # Diagnostics:
1733 run_if( command, path_obj, path_cpp, prerequisites_compile) 1863 #/FC # Display full path of source code files passed to cl.exe in diagnostic text.
1734 1864 /W3 # Sets which warning level to output. /W3 is IDE default.
1735 command, pythonflags = base_linker(cpp=cpp) 1865 /diagnostics:caret # Controls the format of diagnostic messages.
1866 /nologo #
1867
1868 {defines_text}
1869 {compiler_extra}
1870
1871 {py_limited_api3}
1872 '''
1873
1874 else:
1875 compiler_command2 = f'''
1876 {compiler_command}
1877 -fPIC
1878 {general_flags.strip()}
1879 {pythonflags.includes}
1880 {includes_text}
1881 {defines_text}
1882 -MD -MF {prerequisites_path}
1883 -c {path_source}
1884 -o {path_o}
1885 {compiler_extra}
1886 {py_limited_api3}
1887 '''
1888 run_if(
1889 compiler_command2,
1890 path_o,
1891 path_source,
1892 [path_source] + _get_prerequisites(prerequisites_path),
1893 )
1894
1895 # Link
1896 prerequisites_path = f'{path_so}.d'
1897 if windows():
1736 debug2 = '/DEBUG' if debug else '' 1898 debug2 = '/DEBUG' if debug else ''
1737 base, _ = os.path.splitext(path_so_leaf) 1899 base, _ = os.path.splitext(path_so_leaf)
1738 command = f''' 1900 command2 = f'''
1739 {command} 1901 {linker_command}
1740 /DLL # Builds a DLL. 1902 /DLL # Builds a DLL.
1741 /EXPORT:PyInit__{name} # Exports a function. 1903 /EXPORT:PyInit__{name} # Exports a function.
1742 /IMPLIB:{base}.lib # Overrides the default import library name. 1904 /IMPLIB:{base}.lib # Overrides the default import library name.
1743 {libpaths_text} 1905 {libpaths_text}
1744 {pythonflags.ldflags} 1906 {pythonflags.ldflags}
1745 /OUT:{path_so} # Specifies the output file name. 1907 /OUT:{path_so} # Specifies the output file name.
1746 {debug2} 1908 {debug2}
1747 /nologo 1909 /nologo
1748 {libs_text} 1910 {libs_text}
1749 {path_obj} 1911 {' '.join(path_os)}
1750 {linker_extra} 1912 {linker_extra}
1751 ''' 1913 '''
1752 run_if( command, path_so, path_obj, prerequisites_link)
1753
1754 else:
1755
1756 # Not Windows.
1757 #
1758 command, pythonflags = base_compiler(cpp=cpp)
1759
1760 # setuptools on Linux seems to use slightly different compile flags:
1761 #
1762 # -fwrapv -O3 -Wall -O2 -g0 -DPY_CALL_TRAMPOLINE
1763 #
1764
1765 general_flags = ''
1766 if debug:
1767 general_flags += ' -g'
1768 if optimise:
1769 general_flags += ' -O2 -DNDEBUG'
1770 if os.environ.get('EXTRA_CHECKS', '1') != '0': 1914 if os.environ.get('EXTRA_CHECKS', '1') != '0':
1771 general_flags += ' -fno-delete-null-pointer-checks' 1915 general_flags += ' -fno-delete-null-pointer-checks'
1772 general_flags += ' -Werror=implicit-function-declaration' 1916 general_flags += ' -Werror=implicit-function-declaration'
1773 general_flags += ' -fstack-clash-protection' 1917 general_flags += ' -fstack-clash-protection'
1774 general_flags += ' -fstack-protector-strong' 1918 general_flags += ' -fstack-protector-strong'
1775 1919 elif pyodide():
1776 py_limited_api3 = f'-DPy_LIMITED_API={py_limited_api2}' if py_limited_api2 else '' 1920 command2 = f'''
1777 1921 {linker_command}
1778 if darwin(): 1922 -MD -MF {prerequisites_path}
1779 # MacOS's linker does not like `-z origin`. 1923 -o {path_so}
1780 rpath_flag = "-Wl,-rpath,@loader_path/" 1924 {' '.join(path_os)}
1781 1925 {libpaths_text}
1782 # Avoid `Undefined symbols for ... "_PyArg_UnpackTuple" ...'. 1926 {libs_text}
1783 general_flags += ' -undefined dynamic_lookup' 1927 {linker_extra}
1784 elif pyodide(): 1928 {pythonflags.ldflags}
1785 # Setting `-Wl,-rpath,'$ORIGIN',-z,origin` gives: 1929 {rpath_flag}
1786 # emcc: warning: ignoring unsupported linker flag: `-rpath` [-Wlinkflags] 1930 '''
1787 # wasm-ld: error: unknown -z value: origin 1931 else:
1788 # 1932 command2 = f'''
1789 log0(f'pyodide: PEP-3149 suffix untested, so omitting. {_so_suffix()=}.') 1933 {linker_command}
1790 path_so_leaf = f'_{name}.so' 1934 -shared
1791 path_so = f'{outdir}/{path_so_leaf}' 1935 {general_flags.strip()}
1792 1936 -MD -MF {prerequisites_path}
1793 rpath_flag = '' 1937 -o {path_so}
1794 else: 1938 {' '.join(path_os)}
1795 rpath_flag = "-Wl,-rpath,'$ORIGIN',-z,origin" 1939 {libpaths_text}
1796 path_so = f'{outdir}/{path_so_leaf}' 1940 {libs_text}
1797 # Fun fact - on Linux, if the -L and -l options are before '{path_cpp}' 1941 {linker_extra}
1798 # they seem to be ignored... 1942 {pythonflags.ldflags}
1799 # 1943 {rpath_flag}
1800 prerequisites = list() 1944 {py_limited_api3}
1801 1945 '''
1802 if pyodide(): 1946 link_was_run = run_if(
1803 # Looks like pyodide's `cc` can't compile and link in one invocation. 1947 command2,
1804 prerequisites_compile_path = f'{path_cpp}.o.d' 1948 path_so,
1805 prerequisites += _get_prerequisites( prerequisites_compile_path) 1949 path_cpp,
1806 command = f''' 1950 *path_os,
1807 {command} 1951 *_get_prerequisites(f'{path_so}.d'),
1808 -fPIC 1952 )
1809 {general_flags.strip()} 1953
1810 {pythonflags.includes} 1954 if link_was_run and darwin():
1811 {includes_text} 1955 # We need to patch up references to shared libraries in `libs`.
1812 {defines_text} 1956 sublibraries = list()
1813 -MD -MF {prerequisites_compile_path} 1957 for lib in () if libs is None else libs:
1814 -c {path_cpp} 1958 for libpath in libpaths:
1815 -o {path_cpp}.o 1959 found = list()
1816 {compiler_extra} 1960 for suffix in '.so', '.dylib':
1817 {py_limited_api3} 1961 path = f'{libpath}/lib{os.path.basename(lib)}{suffix}'
1818 ''' 1962 if os.path.exists( path):
1819 prerequisites_link_path = f'{path_cpp}.o.d' 1963 found.append( path)
1820 prerequisites += _get_prerequisites( prerequisites_link_path) 1964 if found:
1821 ld, _ = base_linker(cpp=cpp) 1965 assert len(found) == 1, f'More than one file matches lib={lib!r}: {found}'
1822 command += f''' 1966 sublibraries.append( found[0])
1823 && {ld} 1967 break
1824 {path_cpp}.o 1968 else:
1825 -o {path_so} 1969 log2(f'Warning: can not find path of lib={lib!r} in libpaths={libpaths}')
1826 -MD -MF {prerequisites_link_path} 1970 macos_patch( path_so, *sublibraries)
1827 {rpath_flag}
1828 {libpaths_text}
1829 {libs_text}
1830 {linker_extra}
1831 {pythonflags.ldflags}
1832 '''
1833 else:
1834 # We use compiler to compile and link in one command.
1835 prerequisites_path = f'{path_so}.d'
1836 prerequisites = _get_prerequisites(prerequisites_path)
1837
1838 command = f'''
1839 {command}
1840 -fPIC
1841 -shared
1842 {general_flags.strip()}
1843 {pythonflags.includes}
1844 {includes_text}
1845 {defines_text}
1846 {path_cpp}
1847 -MD -MF {prerequisites_path}
1848 -o {path_so}
1849 {compiler_extra}
1850 {libpaths_text}
1851 {linker_extra}
1852 {pythonflags.ldflags}
1853 {libs_text}
1854 {rpath_flag}
1855 {py_limited_api3}
1856 '''
1857 command_was_run = run_if(
1858 command,
1859 path_so,
1860 path_cpp,
1861 prerequisites_compile,
1862 prerequisites_link,
1863 prerequisites,
1864 )
1865
1866 if command_was_run and darwin():
1867 # We need to patch up references to shared libraries in `libs`.
1868 sublibraries = list()
1869 for lib in () if libs is None else libs:
1870 for libpath in libpaths:
1871 found = list()
1872 for suffix in '.so', '.dylib':
1873 path = f'{libpath}/lib{os.path.basename(lib)}{suffix}'
1874 if os.path.exists( path):
1875 found.append( path)
1876 if found:
1877 assert len(found) == 1, f'More than one file matches lib={lib!r}: {found}'
1878 sublibraries.append( found[0])
1879 break
1880 else:
1881 log2(f'Warning: can not find path of lib={lib!r} in libpaths={libpaths}')
1882 macos_patch( path_so, *sublibraries)
1883 1971
1884 #run(f'ls -l {path_so}', check=0) 1972 #run(f'ls -l {path_so}', check=0)
1885 #run(f'file {path_so}', check=0) 1973 #run(f'file {path_so}', check=0)
1886 1974
1975 _extensions_to_py_limited_api[os.path.abspath(path_so)] = py_limited_api
1976
1887 return path_so_leaf 1977 return path_so_leaf
1888 1978
1889 1979
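A rough usage sketch of build_extension() with the arguments documented above; the module name, paths, defines and libraries are invented for illustration:

    path_so_leaf = build_extension(
            name = 'foo',                   # SWIG module name; output is _foo<suffix>.
            path_i = 'foo.i',               # SWIG interface file.
            outdir = 'build',
            includes = ['include'],
            defines = ['FOO_STATIC=1'],
            libpaths = ['build'],
            libs = ['foo'],
            source_extra = ['src/extra.c'],
            cpp = False,                    # generate C rather than C++.
            py_limited_api = True,          # must match the owning pipcl.Package.
            )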
1890 # Functions that might be useful. 1980 # Functions that might be useful.
1891 # 1981 #
2007 capture=1, 2097 capture=1,
2008 check=0 2098 check=0
2009 ) 2099 )
2010 if not e: 2100 if not e:
2011 branch = out.strip() 2101 branch = out.strip()
2012 log(f'git_info(): directory={directory!r} returning branch={branch!r} sha={sha!r} comment={comment!r}') 2102 log1(f'git_info(): directory={directory!r} returning branch={branch!r} sha={sha!r} comment={comment!r}')
2013 return sha, comment, diff, branch 2103 return sha, comment, diff, branch
2014 2104
2015 2105
2016 def git_items( directory, submodules=False): 2106 def git_items( directory, submodules=False):
2017 ''' 2107 '''
2056 ret.append(path) 2146 ret.append(path)
2057 return ret 2147 return ret
2058 2148
2059 2149
2060 def git_get( 2150 def git_get(
2061 remote,
2062 local, 2151 local,
2063 *, 2152 *,
2153 remote=None,
2064 branch=None, 2154 branch=None,
2155 tag=None,
2156 text=None,
2065 depth=1, 2157 depth=1,
2066 env_extra=None, 2158 env_extra=None,
2067 tag=None,
2068 update=True, 2159 update=True,
2069 submodules=True, 2160 submodules=True,
2070 default_remote=None,
2071 ): 2161 ):
2072 ''' 2162 '''
2073 Ensures that <local> is a git checkout (at either <tag>, or <branch> HEAD) 2163 Creates/updates local checkout <local> of remote repository and returns
2074 of a remote repository. 2164 absolute path of <local>.
2075 2165
2076 Exactly one of <branch> and <tag> must be specified, or <remote> must start 2166 If <text> is set but does not start with 'git:', it is assumed to be an up
2077 with 'git:' and match the syntax described below. 2167 to date local checkout, and we return absolute path of <text> without doing
2168 any git operations.
2078 2169
2079 Args: 2170 Args:
2171 local:
2172 Local directory. Created and/or updated using `git clone` and `git
2173 fetch` etc.
2080 remote: 2174 remote:
2081 Remote git repository, for example 2175 Remote git repository, for example
2082 'https://github.com/ArtifexSoftware/mupdf.git'. 2176 'https://github.com/ArtifexSoftware/mupdf.git'. Can be overridden
2177 by <text>.
2178 branch:
2179 Branch to use; can be overridden by <text>.
2180 tag:
2181 Tag to use; can be overridden by <text>.
2182 text:
2183 If None or empty:
2184 Ignored.
2083 2185
2084 If starts with 'git:', the remaining text should be a command-line 2186 If it starts with 'git:':
2085 style string containing some or all of these args: 2187 The remaining text should be a command-line
2086 --branch <branch> 2188 style string containing some or all of these args:
2087 --tag <tag> 2189 --branch <branch>
2088 <remote> 2190 --tag <tag>
2089 These overrides <branch>, <tag> and <default_remote>. 2191 <remote>
2192 These override <branch>, <tag> and <remote>.
2193 Otherwise:
2194 <text> is assumed to be a local directory, and we simply return
2195 it as an absolute path without doing any git operations.
2090 2196
2091 For example these all clone/update/branch master of https://foo.bar/qwerty.git to local 2197 For example these all clone/update/branch master of https://foo.bar/qwerty.git to local
2092 checkout 'foo-local': 2198 checkout 'foo-local':
2093 2199
2094 git_get('https://foo.bar/qwerty.git', 'foo-local', branch='master') 2200 git_get('foo-local', remote='https://foo.bar/qwerty.git', branch='master')
2095 git_get('git:--branch master https://foo.bar/qwerty.git', 'foo-local') 2201 git_get('foo-local', text='git:--branch master https://foo.bar/qwerty.git')
2096 git_get('git:--branch master', 'foo-local', default_remote='https://foo.bar/qwerty.git') 2202 git_get('foo-local', text='git:--branch master', remote='https://foo.bar/qwerty.git')
2097 git_get('git:', 'foo-local', branch='master', default_remote='https://foo.bar/qwerty.git') 2203 git_get('foo-local', text='git:', branch='master', remote='https://foo.bar/qwerty.git')
2098
2099 local:
2100 Local directory. If <local>/.git exists, we attempt to run `git
2101 update` in it.
2102 branch:
2103 Branch to use. Is used as default if remote starts with 'git:'.
2104 depth: 2204 depth:
2105 Depth of local checkout when cloning and fetching, or None. 2205 Depth of local checkout when cloning and fetching, or None.
2106 env_extra: 2206 env_extra:
2107 Dict of extra name=value environment variables to use whenever we 2207 Dict of extra name=value environment variables to use whenever we
2108 run git. 2208 run git.
2109 tag:
2110 Tag to use. Is used as default if remote starts with 'git:'.
2111 update: 2209 update:
2112 If false we do not update existing repository. Might be useful if 2210 If false we do not update existing repository. Might be useful if
2113 testing without network access. 2211 testing without network access.
2114 submodules: 2212 submodules:
2115 If true, we clone with `--recursive --shallow-submodules` and run 2213 If true, we clone with `--recursive --shallow-submodules` and run
2116 `git submodule update --init --recursive` before returning. 2214 `git submodule update --init --recursive` before returning.
2117 default_remote:
2118 The remote URL if <remote> starts with 'git:' but does not specify
2119 the remote URL.
2120 ''' 2215 '''
2121 log0(f'{remote=} {local=} {branch=} {tag=}') 2216 log0(f'{remote=} {local=} {branch=} {tag=}')
2122 if remote.startswith('git:'): 2217
2123 remote0 = remote 2218 if text:
2124 args = iter(shlex.split(remote0[len('git:'):])) 2219 if text.startswith('git:'):
2125 remote = default_remote 2220 args = iter(shlex.split(text[len('git:'):]))
2126 while 1: 2221 while 1:
2127 try: 2222 try:
2128 arg = next(args) 2223 arg = next(args)
2129 except StopIteration: 2224 except StopIteration:
2130 break 2225 break
2131 if arg == '--branch': 2226 if arg == '--branch':
2132 branch = next(args) 2227 branch = next(args)
2133 tag = None 2228 tag = None
2134 elif arg == '--tag': 2229 elif arg == '--tag':
2135 tag == next(args) 2230 tag = next(args)
2136 branch = None 2231 branch = None
2137 else: 2232 else:
2138 remote = arg 2233 remote = arg
2139 assert remote, f'{default_remote=} and no remote specified in remote={remote0!r}.' 2234 assert remote, f'<remote> unset and no remote specified in {text=}.'
2140 assert branch or tag, f'{branch=} {tag=} and no branch/tag specified in remote={remote0!r}.' 2235 assert branch or tag, f'<branch> and <tag> unset and no branch/tag specified in {text=}.'
2236 else:
2237 log0(f'Using local directory {text!r}.')
2238 return os.path.abspath(text)
2141 2239
2142 assert (branch and not tag) or (not branch and tag), f'Must specify exactly one of <branch> and <tag>.' 2240 assert (branch and not tag) or (not branch and tag), f'Must specify exactly one of <branch> and <tag>; {branch=} {tag=}.'
2143 2241
2144 depth_arg = f' --depth {depth}' if depth else '' 2242 depth_arg = f' --depth {depth}' if depth else ''
2145 2243
2146 def do_update(): 2244 def do_update():
2147 # This seems to pull in the entire repository. 2245 # This seems to pull in the entire repository.
2148 log0(f'do_update(): attempting to update {local=}.') 2246 log0(f'do_update(): attempting to update {local=}.')
2149 # Remove any local changes. 2247 # Remove any local changes.
2150 run(f'cd {local} && git checkout .', env_extra=env_extra) 2248 run(f'cd {local} && git reset --hard', env_extra=env_extra)
2151 if tag: 2249 if tag:
2152 # `-u` avoids `fatal: Refusing to fetch into current branch`. 2250 # `-u` avoids `fatal: Refusing to fetch into current branch`.
2153 # Using '+' and `revs/tags/` prefix seems to avoid errors like: 2251 # Using '+' and `revs/tags/` prefix seems to avoid errors like:
2154 # error: cannot update ref 'refs/heads/v3.16.44': 2252 # error: cannot update ref 'refs/heads/v3.16.44':
2155 # trying to write non-commit object 2253 # trying to write non-commit object
2193 if submodules: 2291 if submodules:
2194 run(f'cd {local} && git submodule update --init --recursive', env_extra=env_extra) 2292 run(f'cd {local} && git submodule update --init --recursive', env_extra=env_extra)
2195 2293
2196 # Show sha of checkout. 2294 # Show sha of checkout.
2197 run( f'cd {local} && git show --pretty=oneline|head -n 1', check=False) 2295 run( f'cd {local} && git show --pretty=oneline|head -n 1', check=False)
2296 return os.path.abspath(local)
2198 2297
2199 2298
2200 def run( 2299 def run(
2201 command, 2300 command,
2202 *, 2301 *,
2484 ldflags2 = self.ldflags.replace(' -lcrypt ', ' ') 2583 ldflags2 = self.ldflags.replace(' -lcrypt ', ' ')
2485 if ldflags2 != self.ldflags: 2584 if ldflags2 != self.ldflags:
2486 log2(f'### Have removed `-lcrypt` from ldflags: {self.ldflags!r} -> {ldflags2!r}') 2585 log2(f'### Have removed `-lcrypt` from ldflags: {self.ldflags!r} -> {ldflags2!r}')
2487 self.ldflags = ldflags2 2586 self.ldflags = ldflags2
2488 2587
2489 log1(f'{self.includes=}') 2588 if 0:
2490 log1(f' {includes_=}') 2589 log1(f'{self.includes=}')
2491 log1(f'{self.ldflags=}') 2590 log1(f' {includes_=}')
2492 log1(f' {ldflags_=}') 2591 log1(f'{self.ldflags=}')
2592 log1(f' {ldflags_=}')
2493 2593
2494 2594
2495 def macos_add_cross_flags(command): 2595 def macos_add_cross_flags(command):
2496 ''' 2596 '''
2497 If running on MacOS and environment variables ARCHFLAGS is set 2597 If running on MacOS and environment variables ARCHFLAGS is set
2587 ''' 2687 '''
2588 #log(f'sys.maxsize={hex(sys.maxsize)}') 2688 #log(f'sys.maxsize={hex(sys.maxsize)}')
2589 return f'x{32 if sys.maxsize == 2**31 - 1 else 64}' 2689 return f'x{32 if sys.maxsize == 2**31 - 1 else 64}'
2590 2690
2591 2691
2592 def run_if( command, out, *prerequisites): 2692 def run_if( command, out, *prerequisites, caller=1):
2593 ''' 2693 '''
2594 Runs a command only if the output file is not up to date. 2694 Runs a command only if the output file is not up to date.
2595 2695
2596 Args: 2696 Args:
2597 command: 2697 command:
2617 >>> out = 'run_if_test_out' 2717 >>> out = 'run_if_test_out'
2618 >>> if os.path.exists( out): 2718 >>> if os.path.exists( out):
2619 ... os.remove( out) 2719 ... os.remove( out)
2620 >>> if os.path.exists( f'{out}.cmd'): 2720 >>> if os.path.exists( f'{out}.cmd'):
2621 ... os.remove( f'{out}.cmd') 2721 ... os.remove( f'{out}.cmd')
2622 >>> run_if( f'touch {out}', out) 2722 >>> run_if( f'touch {out}', out, caller=0)
2623 pipcl.py:run_if(): Running command because: File does not exist: 'run_if_test_out' 2723 pipcl.py:run_if(): Running command because: File does not exist: 'run_if_test_out'
2624 pipcl.py:run_if(): Running: touch run_if_test_out 2724 pipcl.py:run_if(): Running: touch run_if_test_out
2625 True 2725 True
2626 2726
2627 If we repeat, the output file will be up to date so the command is not run: 2727 If we repeat, the output file will be up to date so the command is not run:
2628 2728
2629 >>> run_if( f'touch {out}', out) 2729 >>> run_if( f'touch {out}', out, caller=0)
2630 pipcl.py:run_if(): Not running command because up to date: 'run_if_test_out' 2730 pipcl.py:run_if(): Not running command because up to date: 'run_if_test_out'
2631 2731
2632 If we change the command, the command is run: 2732 If we change the command, the command is run:
2633 2733
2634 >>> run_if( f'touch {out}', out) 2734 >>> run_if( f'touch {out};', out, caller=0)
2635 pipcl.py:run_if(): Running command because: Command has changed 2735 pipcl.py:run_if(): Running command because: Command has changed:
2636 pipcl.py:run_if(): Running: touch run_if_test_out 2736 pipcl.py:run_if(): @@ -1,2 +1,2 @@
2737 pipcl.py:run_if(): touch
2738 pipcl.py:run_if(): -run_if_test_out
2739 pipcl.py:run_if(): +run_if_test_out;
2740 pipcl.py:run_if():
2741 pipcl.py:run_if(): Running: touch run_if_test_out;
2637 True 2742 True
2638 2743
2639 If we add a prerequisite that is newer than the output, the command is run: 2744 If we add a prerequisite that is newer than the output, the command is run:
2640 2745
2641 >>> time.sleep(1) 2746 >>> time.sleep(1)
2642 >>> prerequisite = 'run_if_test_prerequisite' 2747 >>> prerequisite = 'run_if_test_prerequisite'
2643 >>> run( f'touch {prerequisite}', caller=0) 2748 >>> run( f'touch {prerequisite}', caller=0)
2644 pipcl.py:run(): Running: touch run_if_test_prerequisite 2749 pipcl.py:run(): Running: touch run_if_test_prerequisite
2645 >>> run_if( f'touch {out}', out, prerequisite) 2750 >>> run_if( f'touch {out}', out, prerequisite, caller=0)
2646 pipcl.py:run_if(): Running command because: Prerequisite is new: 'run_if_test_prerequisite' 2751 pipcl.py:run_if(): Running command because: Command has changed:
2752 pipcl.py:run_if(): @@ -1,2 +1,2 @@
2753 pipcl.py:run_if(): touch
2754 pipcl.py:run_if(): -run_if_test_out;
2755 pipcl.py:run_if(): +run_if_test_out
2756 pipcl.py:run_if():
2647 pipcl.py:run_if(): Running: touch run_if_test_out 2757 pipcl.py:run_if(): Running: touch run_if_test_out
2648 True 2758 True
2649 2759
2650 If we repeat, the output will be newer than the prerequisite, so the 2760 If we repeat, the output will be newer than the prerequisite, so the
2651 command is not run: 2761 command is not run:
2652 2762
2653 >>> run_if( f'touch {out}', out, prerequisite) 2763 >>> run_if( f'touch {out}', out, prerequisite, caller=0)
2654 pipcl.py:run_if(): Not running command because up to date: 'run_if_test_out' 2764 pipcl.py:run_if(): Not running command because up to date: 'run_if_test_out'
2655 ''' 2765 '''
2656 doit = False 2766 doit = False
2657 cmd_path = f'{out}.cmd' 2767 cmd_path = f'{out}.cmd'
2658 2768
2665 if os.path.isfile( cmd_path): 2775 if os.path.isfile( cmd_path):
2666 with open( cmd_path) as f: 2776 with open( cmd_path) as f:
2667 cmd = f.read() 2777 cmd = f.read()
2668 else: 2778 else:
2669 cmd = None 2779 cmd = None
2670 if command != cmd: 2780 cmd_args = shlex.split(cmd or '')
2781 command_args = shlex.split(command or '')
2782 if command_args != cmd_args:
2671 if cmd is None: 2783 if cmd is None:
2672 doit = 'No previous command stored' 2784 doit = 'No previous command stored'
2673 else: 2785 else:
2674 doit = f'Command has changed' 2786 doit = f'Command has changed'
2675 if 0: 2787 if 0:
2676 doit += f': {cmd!r} => {command!r}' 2788 doit += f':\n {cmd!r}\n {command!r}'
2789 if 0:
2790 doit += f'\nbefore:\n'
2791 doit += textwrap.indent(cmd, ' ')
2792 doit += f'\nafter:\n'
2793 doit += textwrap.indent(command, ' ')
2794 if 1:
2795 # Show diff based on commands split into pseudo lines by
2796 # str.split().
2797 doit += ':\n'
2798 lines = difflib.unified_diff(
2799 cmd.split(),
2800 command.split(),
2801 lineterm='',
2802 )
2803 # Skip initial lines.
2804 assert next(lines) == '--- '
2805 assert next(lines) == '+++ '
2806 for line in lines:
2807 doit += f' {line}\n'
2677 2808
2678 if not doit: 2809 if not doit:
2679 # See whether any prerequisites are newer than target. 2810 # See whether any prerequisites are newer than target.
2680 def _make_prerequisites(p): 2811 def _make_prerequisites(p):
2681 if isinstance( p, (list, tuple)): 2812 if isinstance( p, (list, tuple)):
2684 return [p] 2815 return [p]
2685 prerequisites_all = list() 2816 prerequisites_all = list()
2686 for p in prerequisites: 2817 for p in prerequisites:
2687 prerequisites_all += _make_prerequisites( p) 2818 prerequisites_all += _make_prerequisites( p)
2688 if 0: 2819 if 0:
2689 log2( 'prerequisites_all:') 2820 log2( 'prerequisites_all:', caller=caller+1)
2690 for i in prerequisites_all: 2821 for i in prerequisites_all:
2691 log2( f' {i!r}') 2822 log2( f' {i!r}', caller=caller+1)
2692 pre_mtime = 0 2823 pre_mtime = 0
2693 pre_path = None 2824 pre_path = None
2694 for prerequisite in prerequisites_all: 2825 for prerequisite in prerequisites_all:
2695 if isinstance( prerequisite, str): 2826 if isinstance( prerequisite, str):
2696 mtime = _fs_mtime_newest( prerequisite) 2827 mtime = _fs_mtime_newest( prerequisite)
2702 elif prerequisite: 2833 elif prerequisite:
2703 doit = str(prerequisite) 2834 doit = str(prerequisite)
2704 break 2835 break
2705 if not doit: 2836 if not doit:
2706 if pre_mtime > out_mtime: 2837 if pre_mtime > out_mtime:
2707 doit = f'Prerequisite is new: {pre_path!r}' 2838 doit = f'Prerequisite is new: {os.path.abspath(pre_path)!r}'
2708 2839
2709 if doit: 2840 if doit:
2710 # Remove `cmd_path` before we run the command, so any failure 2841 # Remove `cmd_path` before we run the command, so any failure
2711 # will force rerun next time. 2842 # will force rerun next time.
2712 # 2843 #
2713 try: 2844 try:
2714 os.remove( cmd_path) 2845 os.remove( cmd_path)
2715 except Exception: 2846 except Exception:
2716 pass 2847 pass
2717 log1( f'Running command because: {doit}') 2848 log1( f'Running command because: {doit}', caller=caller+1)
2718 2849
2719 run( command) 2850 run( command, caller=caller+1)
2720 2851
2721 # Write the command we ran, into `cmd_path`. 2852 # Write the command we ran, into `cmd_path`.
2722 with open( cmd_path, 'w') as f: 2853 with open( cmd_path, 'w') as f:
2723 f.write( command) 2854 f.write( command)
2724 return True 2855 return True
2725 else: 2856 else:
2726 log1( f'Not running command because up to date: {out!r}') 2857 log1( f'Not running command because up to date: {out!r}', caller=caller+1)
2727 2858
2728 if 0: 2859 if 0:
2729 log2( f'out_mtime={time.ctime(out_mtime)} pre_mtime={time.ctime(pre_mtime)}.' 2860 log2( f'out_mtime={time.ctime(out_mtime)} pre_mtime={time.ctime(pre_mtime)}.'
2730 f' pre_path={pre_path!r}: returning {ret!r}.' 2861 f' pre_path={pre_path!r}: returning {ret!r}.'
2731 ) 2862 )
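For reference, the "Command has changed" report logged above can be reproduced standalone: run_if() compares the shlex-split old and new commands and, when they differ, renders a unified diff of the whitespace-split tokens. A minimal sketch of that rendering (illustrative only; the helper name _command_diff is not part of pipcl):

    import difflib

    def _command_diff(old, new):
        # Split each command into pseudo lines so difflib can compare token by token.
        lines = difflib.unified_diff(old.split(), new.split(), lineterm='')
        next(lines)   # skip the '--- ' header line
        next(lines)   # skip the '+++ ' header line
        return '\n'.join(lines)

    print(_command_diff('touch run_if_test_out', 'touch run_if_test_out;'))
    # @@ -1,2 +1,2 @@
    #  touch
    # -run_if_test_out
    # +run_if_test_out;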
2793 def _normalise(name): 2924 def _normalise(name):
2794 # https://packaging.python.org/en/latest/specifications/name-normalization/#name-normalization 2925 # https://packaging.python.org/en/latest/specifications/name-normalization/#name-normalization
2795 return re.sub(r"[-_.]+", "-", name).lower() 2926 return re.sub(r"[-_.]+", "-", name).lower()
2796 2927
2797 2928
2929 def _normalise2(name):
2930 # https://packaging.python.org/en/latest/specifications/binary-distribution-format/
2931 return _normalise(name).replace('-', '_')
2932
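Illustrative doctest-style cases for the two normalisation helpers (these follow directly from the substitutions above):

    >>> _normalise('My.Package_Name')
    'my-package-name'
    >>> _normalise2('My.Package_Name')
    'my_package_name'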
2933
2798 def _assert_version_pep_440(version): 2934 def _assert_version_pep_440(version):
2799 assert re.match( 2935 assert re.match(
2800 r'^([1-9][0-9]*!)?(0|[1-9][0-9]*)(\.(0|[1-9][0-9]*))*((a|b|rc)(0|[1-9][0-9]*))?(\.post(0|[1-9][0-9]*))?(\.dev(0|[1-9][0-9]*))?(?:\+([a-z0-9]+(?:[-_\.][a-z0-9]+)*))?$', 2936 r'^([1-9][0-9]*!)?(0|[1-9][0-9]*)(\.(0|[1-9][0-9]*))*((a|b|rc)(0|[1-9][0-9]*))?(\.post(0|[1-9][0-9]*))?(\.dev(0|[1-9][0-9]*))?(?:\+([a-z0-9]+(?:[-_\.][a-z0-9]+)*))?$',
2801 version, 2937 version,
2802 ), \ 2938 ), \
2821 ''' 2957 '''
2822 Sets whether to include line numbers; helps with doctest. 2958 Sets whether to include line numbers; helps with doctest.
2823 ''' 2959 '''
2824 global g_log_line_numbers 2960 global g_log_line_numbers
2825 g_log_line_numbers = bool(yes) 2961 g_log_line_numbers = bool(yes)
2962
2963 def log(text='', caller=1):
2964 _log(text, 0, caller+1)
2826 2965
2827 def log0(text='', caller=1): 2966 def log0(text='', caller=1):
2828 _log(text, 0, caller+1) 2967 _log(text, 0, caller+1)
2829 2968
2830 def log1(text='', caller=1): 2969 def log1(text='', caller=1):
2845 print(f'{filename}:{fr.lineno}:{fr.function}(): {line}', file=sys.stdout, flush=1) 2984 print(f'{filename}:{fr.lineno}:{fr.function}(): {line}', file=sys.stdout, flush=1)
2846 else: 2985 else:
2847 print(f'{filename}:{fr.function}(): {line}', file=sys.stdout, flush=1) 2986 print(f'{filename}:{fr.function}(): {line}', file=sys.stdout, flush=1)
2848 2987
2849 2988
2850 def relpath(path, start=None): 2989 def relpath(path, start=None, allow_up=True):
2851 ''' 2990 '''
2852 A safe alternative to os.path.relpath(), avoiding an exception on Windows 2991 A safe alternative to os.path.relpath(), avoiding an exception on Windows
2853 if the drive needs to change - in this case we use os.path.abspath(). 2992 if the drive needs to change - in this case we use os.path.abspath().
2993
2994 Args:
2995 path:
2996 Path to be processed.
2997 start:
2998 Start directory or current directory if None.
2999 allow_up:
3000 If false we return the absolute path if <path> is not within <start>.
2854 ''' 3001 '''
2855 if windows(): 3002 if windows():
2856 try: 3003 try:
2857 return os.path.relpath(path, start) 3004 ret = os.path.relpath(path, start)
2858 except ValueError: 3005 except ValueError:
2859 # os.path.relpath() fails if trying to change drives. 3006 # os.path.relpath() fails if trying to change drives.
2860 return os.path.abspath(path) 3007 ret = os.path.abspath(path)
2861 else: 3008 else:
2862 return os.path.relpath(path, start) 3009 ret = os.path.relpath(path, start)
3010 if not allow_up and (ret.startswith('../') or ret.startswith('..\\')): 
3011 ret = os.path.abspath(path)
3012 return ret
2863 3013
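A brief sketch of the intended relpath() behaviour on POSIX (the paths are hypothetical):

    # relpath('/a/b/c', '/a/b')                   -> 'c'
    # relpath('/a/x',   '/a/b')                   -> '../x'
    # relpath('/a/x',   '/a/b', allow_up=False)   -> '/a/x'
    #     (absolute, because the relative path would escape <start>)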
2864 3014
2865 def _so_suffix(use_so_versioning=True): 3015 def _so_suffix(use_so_versioning=True):
2866 ''' 3016 '''
2867 Filename suffix for shared libraries is defined in pep-3149. The 3017 Filename suffix for shared libraries is defined in pep-3149. The
3013 ret = list() 3163 ret = list()
3014 items = self._items() 3164 items = self._items()
3015 for path, id_ in items.items(): 3165 for path, id_ in items.items():
3016 id0 = self.items0.get(path) 3166 id0 = self.items0.get(path)
3017 if id0 != id_: 3167 if id0 != id_:
3018 #mtime0, hash0 = id0
3019 #mtime1, hash1 = id_
3020 #log0(f'New/modified file {path=}.')
3021 #log0(f' {mtime0=} {"==" if mtime0==mtime1 else "!="} {mtime1=}.')
3022 #log0(f' {hash0=} {"==" if hash0==hash1 else "!="} {hash1=}.')
3023 ret.append(path) 3168 ret.append(path)
3024 return ret 3169 return ret
3170 def get_n(self, n):
3171 '''
3172 Returns new files matching <glob_pattern>, asserting that there are
3173 exactly <n>.
3174 '''
3175 ret = self.get()
3176 assert len(ret) == n, f'{len(ret)=}: {ret}'
3177 return ret
3025 def get_one(self): 3178 def get_one(self):
3026 ''' 3179 '''
3027 Returns new match of <glob_pattern>, asserting that there is exactly 3180 Returns new match of <glob_pattern>, asserting that there is exactly
3028 one. 3181 one.
3029 ''' 3182 '''
3030 ret = self.get() 3183 return self.get_n(1)[0]
3031 assert len(ret) == 1, f'{len(ret)=}'
3032 return ret[0]
3033 def _file_id(self, path): 3184 def _file_id(self, path):
3034 mtime = os.stat(path).st_mtime 3185 mtime = os.stat(path).st_mtime
3035 with open(path, 'rb') as f: 3186 with open(path, 'rb') as f:
3036 content = f.read() 3187 content = f.read()
3037 hash_ = hashlib.md5(content).digest() 3188 hash_ = hashlib.md5(content).digest()
3057 3208
3058 Otherwise we simply return <swig>. 3209 Otherwise we simply return <swig>.
3059 3210
3060 Args: 3211 Args:
3061 swig: 3212 swig:
3062 If starts with 'git:', passed as <remote> arg to git_remote(). 3213 If it starts with 'git:', it is passed as the <text> arg to git_get().
3063 quick: 3214 quick:
3064 If true, we do not update/build local checkout if the binary is 3215 If true, we do not update/build local checkout if the binary is
3065 already present. 3216 already present.
3066 swig_local: 3217 swig_local:
3067 path to use for checkout. 3218 path to use for checkout.
3068 ''' 3219 '''
3069 if swig and swig.startswith('git:'): 3220 if swig and swig.startswith('git:'):
3070 assert platform.system() != 'Windows' 3221 assert platform.system() != 'Windows', f'Cannot build swig on Windows.'
3071 swig_local = os.path.abspath(swig_local) 3222 # Note that {swig_local}/install/bin/swig doesn't work on MacOS because
3072 # Note that {swig_local}/install/bin/swig doesn't work on MacoS because
3073 # {swig_local}/INSTALL is a file and the fs is case-insensitive. 3223 # {swig_local}/INSTALL is a file and the fs is case-insensitive.
3074 swig_binary = f'{swig_local}/install-dir/bin/swig' 3224 swig_binary = f'{swig_local}/install-dir/bin/swig'
3075 if quick and os.path.isfile(swig_binary): 3225 if quick and os.path.isfile(swig_binary):
3076 log1(f'{quick=} and {swig_binary=} already exists, so not downloading/building.') 3226 log1(f'{quick=} and {swig_binary=} already exists, so not downloading/building.')
3077 else: 3227 else:
3078 # Clone swig. 3228 # Clone swig.
3079 swig_env_extra = None 3229 swig_env_extra = None
3080 git_get( 3230 swig_local = git_get(
3081 swig,
3082 swig_local, 3231 swig_local,
3083 default_remote='https://github.com/swig/swig.git', 3232 text=swig,
3233 remote='https://github.com/swig/swig.git',
3084 branch='master', 3234 branch='master',
3085 ) 3235 )
3086 if darwin(): 3236 if darwin():
3087 run(f'brew install automake') 3237 run(f'brew install automake')
3088 run(f'brew install pcre2') 3238 run(f'brew install pcre2')
3093 # > parallel can cause all kinds of trouble. 3243 # > parallel can cause all kinds of trouble.
3094 # > 3244 # >
3095 # > If you need to have bison first in your PATH, run: 3245 # > If you need to have bison first in your PATH, run:
3096 # > echo 'export PATH="/opt/homebrew/opt/bison/bin:$PATH"' >> ~/.zshrc 3246 # > echo 'export PATH="/opt/homebrew/opt/bison/bin:$PATH"' >> ~/.zshrc
3097 # 3247 #
3098 run(f'brew install bison') 3248 swig_env_extra = dict()
3099 PATH = os.environ['PATH'] 3249 macos_add_brew_path('bison', swig_env_extra)
3100 PATH = f'/opt/homebrew/opt/bison/bin:{PATH}' 3250 run(f'which bison')
3101 swig_env_extra = dict(PATH=PATH) 3251 run(f'which bison', env_extra=swig_env_extra)
3102 # Build swig. 3252 # Build swig.
3103 run(f'cd {swig_local} && ./autogen.sh', env_extra=swig_env_extra) 3253 run(f'cd {swig_local} && ./autogen.sh', env_extra=swig_env_extra)
3104 run(f'cd {swig_local} && ./configure --prefix={swig_local}/install-dir', env_extra=swig_env_extra) 3254 run(f'cd {swig_local} && ./configure --prefix={swig_local}/install-dir', env_extra=swig_env_extra)
3105 run(f'cd {swig_local} && make', env_extra=swig_env_extra) 3255 run(f'cd {swig_local} && make', env_extra=swig_env_extra)
3106 run(f'cd {swig_local} && make install', env_extra=swig_env_extra) 3256 run(f'cd {swig_local} && make install', env_extra=swig_env_extra)
3107 assert os.path.isfile(swig_binary) 3257 assert os.path.isfile(swig_binary)
3108 return swig_binary 3258 return swig_binary
3109 else: 3259 else:
3110 return swig 3260 return swig
3261
3262
3263 def macos_add_brew_path(package, env=None, gnubin=True):
3264 '''
3265 Adds path(s) for Brew <package>'s binaries to env['PATH'].
3266
3267 Args:
3268 package:
3269 Name of package. We get <package_root> of installed package by
3270 running `brew --prefix <package>`.
3271 env:
3272 The environment dict to modify. If None we use os.environ. If PATH
3273 is not in <env>, we first copy os.environ['PATH'] into <env>.
3274 gnubin:
3275 If true, we also add the path to GNU binaries if it exists,
3276 <package_root>/libexec/gnubin.
3277 '''
3278 if not darwin():
3279 return
3280 if env is None:
3281 env = os.environ
3282 if 'PATH' not in env:
3283 env['PATH'] = os.environ['PATH']
3284 package_root = run(f'brew --prefix {package}', capture=1).strip()
3285 def add(path):
3286 if os.path.isdir(path):
3287 log1(f'Adding to $PATH: {path}')
3288 PATH = env['PATH']
3289 env['PATH'] = f'{path}:{PATH}'
3290 add(f'{package_root}/bin')
3291 if gnubin:
3292 add(f'{package_root}/libexec/gnubin')
3111 3293
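A hedged usage sketch for macos_add_brew_path() (assumes Homebrew is installed; 'gnu-sed' is only an example package name):

    env = dict()
    macos_add_brew_path('gnu-sed', env)     # no-op on non-macOS platforms
    run('sed --version', env_extra=env)     # sees GNU sed via <package_root>/libexec/gnubin if present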
3112 3294
3113 def _show_dict(d): 3295 def _show_dict(d):
3114 ret = '' 3296 ret = ''
3115 for n in sorted(d.keys()): 3297 for n in sorted(d.keys()):
3151 ldflags_ = f'-L {ldflags_}' 3333 ldflags_ = f'-L {ldflags_}'
3152 includes_ = ' '.join(includes_) 3334 includes_ = ' '.join(includes_)
3153 return includes_, ldflags_ 3335 return includes_, ldflags_
3154 3336
3155 3337
3338 def venv_in(path=None):
3339 '''
3340 If path is None, returns true if we are in a venv. Otherwise returns true
3341 only if we are in venv <path>.
3342 '''
3343 if path:
3344 return os.path.abspath(sys.prefix) == os.path.abspath(path)
3345 else:
3346 return sys.prefix != sys.base_prefix
3347
3348
3349 def venv_run(args, path, recreate=True, clean=False):
3350 '''
3351 Runs Python command inside venv and returns termination code.
3352
3353 Args:
3354 args:
3355 List of args or string command.
3356 path:
3357 Path of venv directory.
3358 recreate:
3359 If false we do not run `<sys.executable> -m venv <path>` if <path>
3360 already exists. This avoids a delay in the common case where <path>
3361 is already set up, but fails if <path> exists but does not contain
3362 a valid venv.
3363 clean:
3364 If true we first delete <path>, which must start with 'venv-' (a
3364 safety check against deleting arbitrary directories).
3365 '''
3366 if clean:
3367 log(f'Removing any existing venv {path}.')
3368 assert path.startswith('venv-')
3369 shutil.rmtree(path, ignore_errors=1)
3370 if recreate or not os.path.isdir(path):
3371 run(f'{sys.executable} -m venv {path}')
3372
3373 if isinstance(args, str):
3374 args_string = args
3375 elif platform.system() == 'Windows':
3376 # shlex is not reliable on Windows so we use crude quoting with "...".
3377 args_string = ''
3378 for i, arg in enumerate(args):
3379 assert '"' not in arg
3380 if i:
3381 args_string += ' '
3382 args_string += f'"{arg}"'
3383 else:
3384 args_string = shlex.join(args)
3385
3386 if platform.system() == 'Windows':
3387 command = f'{path}\\Scripts\\activate && python {args_string}'
3388 else:
3389 command = f'. {path}/bin/activate && python {args_string}'
3390 e = run(command, check=0)
3391 return e
3392
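A usage sketch for the venv helpers above (the venv directory name is illustrative; it must start with 'venv-' if clean=True is used):

    # Run a module inside a throwaway venv; venv_run() returns the exit code.
    e = venv_run(['-m', 'pip', '--version'], 'venv-pipcl-test')
    assert e == 0

    # venv_in() reports whether the current interpreter is itself inside a venv.
    if not venv_in():
        venv_run('-m pip list', 'venv-pipcl-test', recreate=False)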
3393
3156 if __name__ == '__main__': 3394 if __name__ == '__main__':
3157 # Internal-only limited command line support, used if 3395 # Internal-only limited command line support, used if
3158 # graal_legacy_python_config is true. 3396 # graal_legacy_python_config is true.
3159 # 3397 #
3160 includes, ldflags = sysconfig_python_flags() 3398 includes, ldflags = sysconfig_python_flags()
3161 if sys.argv[1:] == ['--graal-legacy-python-config', '--includes']: 3399 if sys.argv[1:2] == ['--doctest']:
3400 import doctest
3401 if sys.argv[2:]:
3402 for f in sys.argv[2:]:
3403 ff = globals()[f]
3404 doctest.run_docstring_examples(ff, globals())
3405 else:
3406 doctest.testmod(None)
3407 elif sys.argv[1:] == ['--graal-legacy-python-config', '--includes']:
3162 print(includes) 3408 print(includes)
3163 elif sys.argv[1:] == ['--graal-legacy-python-config', '--ldflags']: 3409 elif sys.argv[1:] == ['--graal-legacy-python-config', '--ldflags']:
3164 print(ldflags) 3410 print(ldflags)
3165 else: 3411 else:
3166 assert 0, f'Expected `--graal-legacy-python-config --includes|--ldflags` but {sys.argv=}' 3412 assert 0, f'Expected `--doctest [name ...]` or `--graal-legacy-python-config --includes|--ldflags` but {sys.argv=}'