Comment: wiki reference upgrade
SHA1: c8c4e6985ddb87adfef5ac32262dd96a
User & Date: vhost7825ssh on 2017-05-20 10:29:08
Timeline:
- 2018-01-22 21:20 | VS temporary rename | check-in: e19a814400 | user: martin_vahi | tags: trunk
- 2017-05-20 10:29 | wiki reference upgrade | check-in: c8c4e6985d | user: vhost7825ssh | tags: trunk
- 2017-05-19 18:47 | wiki reference upgrade | check-in: dee8e3e8ea | user: vhost7825ssh | tags: trunk
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Add New Repository.md version [84bb4c199f].
## Add a New GitHub Repository - QA Steps

New GitHub repo created? Then this document walks you through the QA steps to standardise your repo alongside all the other MaidSafe GitHub repositories. For all steps and tools, use the MaidSafe-QA user unless the instructions specify otherwise.

### Fork the New Repository

While logged into GitHub under your own account, fork the new repo and clone it locally.

### Log in to GitHub as MaidSafe-QA

Log out of your own account and log back in as the MaidSafe-QA user.

*At this stage you need to request temporary GitHub "Admin" privileges from Fraser, Viv or David.*

### Add Repository to Travis

Log in to [Travis](https://travis-ci.org/), sync your account, find the new repository you want to add and flick its switch to on.

### Add Repository to AppVeyor

Log in to [AppVeyor](https://ci.appveyor.com/login), select `+ NEW PROJECT`, then select the repository you would like to add.

### Add `appveyor.yml` and `.travis.yml` Scripts to the New Repository

From another [MaidSafe GitHub repository](https://github.com/maidsafe), copy the `appveyor.yml` and `.travis.yml` files into the root of your newly-forked local clone of the new repository. The `.travis.yml` will require minor tweaking (more on this in the following steps), especially creating and updating the secure token, which is used to upload Rust documentation.

### Give Travis Permissions

While still logged into GitHub as the MaidSafe-QA user, go to settings and select "Personal access tokens". Click `Generate new token`, create a new "Travis Deploy Token - <new repo name>" and limit its scopes to `public_repo`.

Once you have clicked "Generate token", copy the output, as you will not see it again.

[Install the Travis gem](https://github.com/travis-ci/travis.rb#installation) to encrypt secure GitHub access.

Run the following, where `<YOUR_TOKEN>` is the token copied in the previous step:

`travis encrypt -r maidsafe/<new_repo> GH_TOKEN=<YOUR_TOKEN>`

Edit the `.travis.yml` file you added to the new repo and replace the long string in the `- secure:` line with the output you have just generated.

If you are not going to update the repository's `README.md` at this point, you can push all your local changes upstream and issue a PR to add them to the main repository.

### Webhooks - Add Highfive

Go to the project's settings (the `maidsafe` repository, not your fork) *> Settings > Webhooks & services > Add webhook*.

The Payload URL is

```
http://visualiser.maidsafe.net/cgi-bin/highfive/newpr.py
```

### Highfive Backend Configuration

SSH (details in the private Assets GitHub repository) to the droplet hosting Highfive.

Navigate to `/usr/lib/cgi-bin/highfive/configs/` and create a new `<repository_name>.json` file (copy an existing `.json` file).

Edit the new `<repository_name>.json` file and update the maintainers' names. The important section is "groups"; note that entries and file names are case sensitive.

### Add Coverage

Log in to [coveralls.io](https://coveralls.io/) using the MaidSafe-QA GitHub account, click `RE-SYNC REPOS`, then `ADD REPOS`, and flick the switch on your new repository.

### Update the New Repo's `README.md`

Below is a template; it is best to take the markdown from another repository and edit it to fit the purposes of the new repository.

# < repository_name >

[](http://maidsafe.net/applications) [](https://github.com/maidsafe/crust/blob/master/COPYING)

**Primary Maintainer:** < name > (< email_address >)

**Secondary Maintainer:** < name > (< email_address >)

Reliable peer-to-peer network connections in Rust with NAT traversal.

|Crate|Linux/OS X|Windows|Coverage|Issues|
|:---:|:--------:|:-----:|:------:|:----:|
|[](https://crates.io/crates/crust)|[](https://travis-ci.org/maidsafe/crust)|[](https://ci.appveyor.com/project/MaidSafe-QA/crust/branch/master)|[](https://coveralls.io/r/maidsafe/crust)|[](https://waffle.io/maidsafe/crust)|

|[API Documentation - master branch](http://maidsafe.net/crust/master)|[SAFE Network System Documentation](http://systemdocs.maidsafe.net)|[MaidSafe website](http://maidsafe.net)|[SAFE Network Forum](https://forum.safenetwork.io)|
|:------:|:-------:|:-------:|:-------:|

## Overview

< insert_overview >

## Todo Items

< insert_todo_items >

*In the above example the badges and links are for `crust`, purely for illustrative purposes.*

One niggle worth noting for AppVeyor badges that has caught a few folk out: you need to grab the markdown for the master-branch badge. On the AppVeyor site this is found on the new repo's page under *Settings > Badges* and is the sixth (last) entry on the page. This is the one that needs to be pasted into the project's `README.md` and the QA `README.md`.

### Switch On "Build only if .travis.yml / appveyor.yml is present"

Log into Travis, go to the repository *> Settings > General Settings* and switch `ON` the *Build only if .travis.yml is present* setting.

Log into AppVeyor, go to the repository *> Settings > General* and tick the *Do not build tags*, *Skip branches without appveyor.yml* and *Rolling builds* check boxes.

### Add Reviewable

Log in to https://reviewable.io using the MaidSafe-QA GitHub account, go to the *Repositories* section and toggle the new repository to green to enable Reviewable for pull requests.

### Update the QA `README.md`

Finally, add a new entry to https://github.com/maidsafe/QA/blob/master/README.md and issue a PR for it.

### Revoke GitHub "Admin" from the MaidSafe-QA User

Once everything is complete, revoke the elevated privileges and reduce them back to "Write".

*Ensure `Owners` have "Admin" privileges and `Bots` and `Developers` have "Write" privileges.*

### Checklist to See if Everything Is OK

* Did Travis run?
* Did AppVeyor run?
* Does Highfive allocate a reviewer for a PR?
* Do all the links and badges go to the correct places?
* On a successful merge to master, did Travis create and publish the documentation?
* Did Coverage run?
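The token-encryption steps above can be sketched end to end. This is a minimal sketch: the `travis` CLI commands are shown as comments because they need network access and the Travis gem installed, and the YAML layout, repo name and `AbCdEfGh==` token value are placeholder assumptions, not real values.

```shell
# Sketch of the "Give Travis Permissions" flow above. The travis-gem
# commands are shown as comments (they need network access):
#
#   gem install travis
#   travis encrypt -r maidsafe/<new_repo> GH_TOKEN=<YOUR_TOKEN>
#
# The encrypt command prints a value for the `- secure:` line. Splicing
# it into .travis.yml can be done with GNU sed (placeholder values):
NEW_SECURE='AbCdEfGh=='                            # output of `travis encrypt`
printf 'env:\n  global:\n    - secure: "OLD"\n' > .travis.yml
sed -i "s|- secure: \".*\"|- secure: \"$NEW_SECURE\"|" .travis.yml
cat .travis.yml
```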
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Installers/Vault/Draft Tests/Linux Process.md version [579123b592].
# Create Package for Vault on Linux

- [ ] Run the package creation script `safe_vault/installer/linux/scripts/create_packages.sh` in the `safe_vault` repository
- Check RPM (on e.g. a Fedora test machine)
  - Check installer can upgrade an existing version which is running
    - [ ] Check test machine has older version already installed and `safe_vault` is running
    - [ ] Copy the current bootstrap and config files
    - [ ] New installer should run without errors
    - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
    - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
    - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
  - Check installer can upgrade an existing version which is not running
    - [ ] Check test machine has older version already installed and `safe_vault` is NOT running
    - [ ] Copy the current bootstrap and config files
    - [ ] New installer should run without errors
    - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
    - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
    - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
  - Check installer succeeds on machine with no previous version installed
    - [ ] Check test machine has no version already installed
    - [ ] Installer should run without errors
    - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
    - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
    - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
  - Check repair where current version already installed
    - [ ] Kill and remove existing version of `maidsafe_vault`
    - [ ] Copy the current bootstrap and config files
    - [ ] Installer should rerun without errors
    - [ ] Check `safe_vault` is running and is installed in `/usr/bin/`
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check bootstrap and config files haven't been overwritten
    - [ ] Remove bootstrap and config files
    - [ ] Installer should rerun without errors
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check config file installed in `/var/cache/safe_vault/` has `-rw-r--r--` permissions and `safe` owner name and `root` group name
  - Check uninstall
    - [ ] Check `safe_vault` is running
    - [ ] Uninstall should run without errors
    - [ ] Check `safe_vault` is not running
    - [ ] Check `safe_vault`, bootstrap and config files have all been removed
  - [ ] Copy installer from slave to yum repository machine
  - [ ] Update yum repository
  - [ ] Check `yum install safe-vault` works on a clean machine
  - [ ] Check `yum update` updates existing version
- Check .deb (on e.g. an Ubuntu test machine)
  - Check installer can upgrade an existing version which is running
    - [ ] Check test machine has older version already installed and `safe_vault` is running
    - [ ] Copy the current bootstrap and config files
    - [ ] New installer should run without errors
    - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
    - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
    - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
  - Check installer can upgrade an existing version which is not running
    - [ ] Check test machine has older version already installed and `safe_vault` is NOT running
    - [ ] Copy the current bootstrap and config files
    - [ ] New installer should run without errors
    - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
    - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
    - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
  - Check installer succeeds on machine with no previous version installed
    - [ ] Check test machine has no version already installed
    - [ ] Installer should run without errors
    - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
    - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
    - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
  - Check repair where current version already installed
    - [ ] Kill and remove existing version of `safe_vault`
    - [ ] Copy the current bootstrap and config files
    - [ ] Installer should rerun without errors
    - [ ] Check `safe_vault` is running and is installed in `/usr/bin/`
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check bootstrap and config files haven't been overwritten
    - [ ] Remove bootstrap and config files
    - [ ] Installer should rerun without errors
    - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
    - [ ] Check config file installed in `/var/cache/safe_vault/` has `-rw-r--r--` permissions and `safe` owner name and `root` group name
  - Check uninstall
    - [ ] Check `safe_vault` is running
    - [ ] Uninstall should run without errors
    - [ ] Check `safe_vault` is not running
    - [ ] Check `safe_vault`, bootstrap and config files have all been removed
  - [ ] Copy installer from slave to apt repository machine
  - [ ] Update apt repository
  - [ ] Check `apt-get install safe-vault` works on a clean machine
  - [ ] Check `apt-get update && apt-get upgrade` updates existing version
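The recurring permission and ownership checks in the list above lend themselves to a small script. This is a sketch only: it creates a local stand-in file rather than touching `/usr/bin/safe_vault`, and the `safe:safe` owner/group comparison is left as a comment since it only holds on a real test machine.

```shell
# Sketch of scripting the permission/ownership checks from the checklist.
# BIN is a local stand-in; on a real test machine point it at
# /usr/bin/safe_vault and also assert owner/group are 'safe'.
BIN=./safe_vault_stand_in
touch "$BIN" && chmod 755 "$BIN"

perms=$(stat -c '%A' "$BIN")      # symbolic permissions, e.g. -rwxr-xr-x
owner=$(stat -c '%U:%G' "$BIN")   # expect safe:safe on the test machine
echo "$BIN: $perms $owner"

[ "$perms" = "-rwxr-xr-x" ] || echo "FAIL: wrong permissions"
pgrep -x safe_vault >/dev/null && echo "safe_vault running" || echo "safe_vault not running"
```

The same `stat` checks apply unchanged to `/var/cache/safe_vault/safe_vault.crust.config` (expecting `-rw-r--r--`).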
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Installers/Vault/Draft Tests/OS X Process.md version [d121b73ad8].
# Create Package for Vault on OS X

- [ ] Run the package creation script `safe_vault/installer/osx/scripts/create_packages.sh` in the `safe_vault` repository
- Check installer can upgrade an existing version which is running
  - [ ] Check test machine has older version already installed and `safe_vault` is running
  - [ ] Copy the current bootstrap and config files
  - [ ] New installer should run without errors
  - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
  - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
  - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
  - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
  - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
- Check installer can upgrade an existing version which is not running
  - [ ] Check test machine has older version already installed and `safe_vault` is NOT running
  - [ ] Copy the current bootstrap and config files
  - [ ] New installer should run without errors
  - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
  - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
  - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
  - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
  - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
- Check installer succeeds on machine with no previous version installed
  - [ ] Check test machine has no version already installed
  - [ ] Installer should run without errors
  - [ ] Check new version of `safe_vault` is running and is installed in `/usr/bin/`
  - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
  - [ ] Check correct config file(s) are installed to the system cache dir `/var/cache/safe_vault`
  - [ ] Check `safe_vault.crust.config` file has `-rw-r--r--` permissions and `safe` owner name and group name
  - [ ] Check bootstrap and config files are not present in app support dir `$HOME/.config/safe_vault/`
- Check repair where current version already installed
  - [ ] Kill and remove existing version of `safe_vault`
  - [ ] Copy the current bootstrap and config files
  - [ ] Installer should rerun without errors
  - [ ] Check `safe_vault` is running and is installed in `/usr/bin/`
  - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
  - [ ] Check bootstrap and config files haven't been overwritten
  - [ ] Remove bootstrap and config files
  - [ ] Installer should rerun without errors
  - [ ] Check `safe_vault` has `-rwxr-xr-x` permissions and `safe` owner name and group name
  - [ ] Check config file installed in `/var/cache/safe_vault/` has `-rw-r--r--` permissions and `safe` owner name and `root` group name
- Check uninstall
  - [ ] Check `safe_vault` is running
  - [ ] Uninstall should run without errors
  - [ ] Check `safe_vault` is not running
  - [ ] Check `safe_vault`, bootstrap and config files have all been removed
- Check installer can be downloaded
  - [ ] Webpage should detect OS and show link to appropriate installer
  - [ ] Download installer and hash check it against original
  - [ ] Check downloaded filename is meaningful
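The "hash check it against original" step above can be sketched as follows. This uses made-up stand-in files rather than a real installer; `sha256sum` is the Linux coreutils tool, and on OS X `shasum -a 256` gives the same digest.

```shell
# Sketch of the installer hash check, with a stand-in file in place of
# a real downloaded installer ('installer bytes' is dummy content).
printf 'installer bytes' > original.pkg
cp original.pkg downloaded.pkg     # stands in for the actual download

orig=$(sha256sum original.pkg | awk '{print $1}')     # shasum -a 256 on OS X
dl=$(sha256sum downloaded.pkg | awk '{print $1}')

if [ "$orig" = "$dl" ]; then echo "hash OK"; else echo "hash MISMATCH"; fi
```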
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Installers/Vault/Draft Tests/Windows Process.md version [8e241a7f6a].
# Create Package for Vault on Windows

- [ ] Run the installer creation script `safe_vault/installer/windows/create_installer.ps1` in the `safe_vault` repository
- Check installer can upgrade (using default options) an existing version installed to default location which is running
  - [ ] Check test machine has older version already installed using default options and `safe_vault.exe` is running
  - [ ] Copy the current bootstrap and config files
  - [ ] New installer should run without errors
  - [ ] Check new version of `safe_vault.exe` is running and is installed in default location
  - [ ] Check bootstrap and config files haven't been overwritten
- Check installer can upgrade (using default options) an existing version installed to default location which is not running
  - [ ] Check test machine has older version already installed using default options and `safe_vault.exe` is NOT running
  - [ ] Copy the current bootstrap and config files
  - [ ] New installer should run without errors
  - [ ] Check new version of `safe_vault.exe` is running and is installed in default location
  - [ ] Check bootstrap and config files haven't been overwritten
- Check installer can upgrade (using default options) an existing version installed to non-default location which is running
  - [ ] Check test machine has older version already installed using NON-default options and `safe_vault.exe` is running
  - [ ] Copy the current bootstrap and config files
  - [ ] New installer should run without errors
  - [ ] Check new version of `safe_vault.exe` is running and is installed in default location
  - [ ] Check old version of `safe_vault.exe` has been deleted from non-default location
  - [ ] Check bootstrap and config files haven't been overwritten
- Check installer succeeds using default options on machine with no previous version installed
  - [ ] Check test machine has no version already installed
  - [ ] Installer should run without errors
  - [ ] Check new version of `safe_vault.exe` is running and is installed in default location
  - [ ] Check bootstrap and config files are installed in their default locations
- Check repair where current version installed using defaults
  - [ ] Kill and remove existing version of `safe_vault.exe`
  - [ ] Copy the current bootstrap and config files
  - [ ] Installer should run repair without errors
  - [ ] Check `safe_vault.exe` is running and has been re-installed to previous location
  - [ ] Check bootstrap and config files haven't been overwritten
  - [ ] Remove bootstrap and config files
  - [ ] Installer should run repair without errors
  - [ ] Check `safe_vault.exe` is running and is installed in previous location
- Check repair where current version installed to non-default location
  - [ ] Kill and remove existing version of `safe_vault.exe`
  - [ ] Copy the current bootstrap and config files
  - [ ] Installer should run repair without errors
  - [ ] Check `safe_vault.exe` is running and has been re-installed to previous location
  - [ ] Check bootstrap and config files haven't been overwritten
  - [ ] Remove bootstrap and config files
  - [ ] Installer should run repair without errors
  - [ ] Check `safe_vault.exe` is running and is installed in previous location
- Check uninstall where current version installed using defaults
  - [ ] Check `safe_vault.exe` is running
  - [ ] Uninstall should run without errors
  - [ ] Check `safe_vault.exe` is not running
  - [ ] Check `safe_vault.exe`, bootstrap and config files have all been removed
- Check uninstall where current version installed to non-default location
  - [ ] Check `safe_vault.exe` is running
  - [ ] Uninstall should run without errors
  - [ ] Check `safe_vault.exe` is not running
  - [ ] Check `safe_vault.exe`, bootstrap and config files have all been removed
- [ ] Copy installer from slave to website
- [ ] Update website to link to new installer
- Check installer can be downloaded
  - [ ] Webpage should detect OS and show link to appropriate installer
  - [ ] Download installer and hash check it against original
  - [ ] Check downloaded filename is meaningful
  - [ ] Check installer has appropriate high-res icon
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Installers/Vault/Update Apt and Yum Repos.md version [bf7f894505].
# Update Apt and Yum Repos

##### Build and Transfer 32-bit Package

```sh
ssh maidsafe@178.62.25.205

rustup update
git -C QA pull

cd safe_vault
git pull

~/QA/Bash\ Scripts/create_linux_vault_package.sh
```

##### Build and Transfer 64-bit Package

```sh
ssh maidsafe@178.62.85.248

rustup update
git -C QA pull

cd safe_vault
git pull

~/QA/Bash\ Scripts/create_linux_vault_package.sh
```

##### Update Apt Repo

```sh
ssh maidsafe@apt.maidsafe.net
Version=$(cat safe_vault_latest_version.txt)
cd /var/www/repos/apt/debian

# sudo reprepro remove jessie safe-vault
# sudo reprepro remove wheezy safe-vault

sudo reprepro includedeb jessie ~/SysV-style/safe-vault_"$Version"_amd64.deb
sudo reprepro includedeb jessie ~/SysV-style/safe-vault_"$Version"_i386.deb
sudo reprepro includedeb wheezy ~/SysV-style/safe-vault_"$Version"_amd64.deb
sudo reprepro includedeb wheezy ~/SysV-style/safe-vault_"$Version"_i386.deb

mv ~/safe_*.tar.gz /var/www/tarballs/
```

##### Update Yum Repo

```sh
ssh maidsafe@yum.maidsafe.net
cd /var/www/repos
cp ~/SysV-style/* .
rpm --resign *.rpm
createrepo .  # need '--checksum sha' for at least CentOS <= 5.10; see http://linux.die.net/man/8/createrepo
gpg2 --detach-sign --armor repodata/repomd.xml
```

---

##### Apt Links

- http://www.jejik.com/articles/2006/09/setting_up_and_managing_an_apt_repository_with_reprepro/
- https://mirrorer.alioth.debian.org/reprepro.1.html
- https://wiki.debian.org/HowToSetupADebianRepository#reprepro_for_new_packages
- https://wiki.debian.org/SettingUpSignedAptRepositoryWithReprepro
- https://scotbofh.wordpress.com/2011/04/26/creating-your-own-signed-apt-repository-and-debian-packages/

##### Yum Links

- http://www.idimmu.net/2009/10/20/creating-a-local-and-http-redhat-yum-repository/
- http://yum.baseurl.org/wiki/RepoCreate
- http://fedoranews.org/tchung/gpg/
- https://iuscommunity.org/pages/CreatingAGPGKeyandSigningRPMs.html
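The sessions above hinge on the version handoff: the build machines write the released version to `safe_vault_latest_version.txt`, and the repo-update session reads it back into `$Version` to name the `.deb` files. A minimal local sketch of that expansion (the `0.0.0` version is a placeholder, and the file is created here rather than by a real build):

```shell
# Demonstrate the version handoff with a local stand-in file.
printf '0.0.0\n' > safe_vault_latest_version.txt
Version=$(cat safe_vault_latest_version.txt)

# The reprepro calls then expand to concrete package names:
echo "safe-vault_${Version}_amd64.deb"
echo "safe-vault_${Version}_i386.deb"
```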
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Installers/Vault/Windows Installers.md version [1d5cc36384].
# Windows Installers

On each of the Windows build machines in the office (one 32-bit, one 64-bit, both Windows 7), follow this process:

- Open `C:\MaidSafe\safe_vault\installer\windows\safe_vault_32_and_64_bit.aip` in a text editor
- Search for the phrase `Enter path to certificate.p12` and replace it with the actual path to the certificate
- Open a **PowerShell** terminal and run the following commands:

```powershell
. rustup update
. "C:\Program Files\Git\bin\git.exe" -C C:\MaidSafe\QA pull

cd C:\MaidSafe\safe_vault
. "C:\Program Files\Git\bin\git.exe" pull

. installer\windows\create_installer.ps1

. "C:\Program Files\Git\bin\git.exe" checkout .
```
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Managing Remote Servers.md version [fb67971f79].
# Managing Remote Servers

The objective of this document is to detail a standard process for handling remote servers (e.g. Droplets), so that all MaidSafe remote servers are secure and can be accessed in a similar way. This should make working with and scripting for these servers simpler.

Note that this does not apply to "throw-away" remote servers which are used for short-term testing and need not be secure.

### Setting up a New Server

Where there is a choice, we should never allow the host to send us a root password via email. If a root or sudo user's password _is_ ever emailed (even internally between two MaidSafe employees), it should immediately be treated as compromised and changed.

In the case of Droplets, we should add all QA members' SSH keys by default. This allows any QA member to ssh into the droplet as root. However, this should generally only ever be done once, in order to create the new `qa` user as detailed below. Working as root is not good practice and should be kept to a minimum.

As soon as a new server is created, the following steps should be taken:

1. ssh into the server as root
1. create a sudo user named `qa` with a strong, unique, random password. On Ubuntu:

    ```bash
    adduser qa
    adduser qa sudo
    ```

    or on Fedora:

    ```bash
    useradd qa
    passwd qa
    usermod qa -a -G wheel
    ```

1. exit the ssh session
1. add details of the server to an existing or new document in the QA folder of the private [Assets](https://github.com/maidsafe/Assets/tree/master/QA) repository

### Managing the Servers

#### Compromised Password

If the password of a sudo user is compromised (e.g. laptop lost/stolen, password emailed), all affected servers should be updated as soon as possible. As passwords should be unique, this should apply to just a single user account on a single server.

The fix can either be to change the password or to delete the user.

#### Compromised SSH Key

If the private SSH key of a sudo user is compromised (e.g. laptop lost/stolen, private key emailed!), all affected servers should be updated as soon as possible.

The hard part will be identifying all the accounts to which this key has access. For a QA team member, this will likely include the root user, their own user account and perhaps other users' accounts on every remote server.

The fix is to remove the affected key from the relevant `authorized_keys` files. These will be in `/home/<USER>/.ssh/` or `/root/.ssh/`.

#### Adding New Users

If for whatever reason a non-QA team member wants to access a remote server, don't share credentials with that member; instead create a new user account for them. Normally, the only shared account should be the `qa` one (an exception is the `peer1` account on the `peer_prog.maidsafe.net` Droplet).

Before creating an account for them, ensure that they really need access to the secure server. If their work can be done on a non-secure, throw-away Droplet, for example, then that is the best option.

Don't give the new user sudo access if it is not required. If sudo access _is_ required, then create the new user with a strong, unique, random password, but **don't email this password** to the team member. Instead, send it via a mumble message.

The team member should be asked never to change the password to a weak one, nor to one which they use elsewhere. They should also notify QA once the account can be deleted.
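The compromised-key fix described above can be sketched as a one-liner against a demo `authorized_keys` file. Everything here is a made-up stand-in: the directory, the truncated key bodies and the user comments are illustrative only; on a real server you would edit `/home/<USER>/.ssh/authorized_keys` or `/root/.ssh/authorized_keys`.

```shell
# Build a demo authorized_keys file with three (truncated, fake) keys.
mkdir -p demo_home/.ssh
cat > demo_home/.ssh/authorized_keys <<'EOF'
ssh-rsa AAAA...keyA alice@laptop
ssh-rsa AAAA...keyB bob@lost-laptop
ssh-rsa AAAA...keyC carol@desktop
EOF

# Remove only the compromised key, matching on its trailing comment:
sed -i '/bob@lost-laptop/d' demo_home/.ssh/authorized_keys
cat demo_home/.ssh/authorized_keys
```

Matching on the key's comment is convenient when the comments identify owners; if they don't, match on the key body instead.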
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Rust Style.md version [7401f55d1a].
1 -# Contributing Rust code to MaidSafe
2 -
3 -We don't maintain a separate style guide, but in general we try to follow [common good practice](https://aturon.github.io/), write readable and idiomatic code, and aim for full test coverage. In addition, this document lists a few decisions we've reached in discussions about specific topics.
4 -
5 -## Rust version
6 -
7 -We currently use Rust stable 1.16.0.
8 -
9 -## Unwrap
10 -
11 -Don't unwrap [`Option`](https://doc.rust-lang.org/std/option/enum.Option.html)s or [`Result`](https://doc.rust-lang.org/std/result/enum.Result.html)s, except possibly when:
12 -
13 -1. locking a mutex,
14 -2. spawning a thread,
15 -3. joining a thread
16 -
17 -or in other patterns where using them makes the code _much simpler_ and it is _obvious at first glance_ to the reader (even one unfamiliar with the code) that the value cannot be `None`/`Err`.
18 -
19 -In these cases, as well as in tests, consider using the macros from the [`unwrap` crate](https://crates.io/crates/unwrap).
20 -
21 -## Threads
22 -
23 -Generally avoid detached threads. Give child threads meaningful names.
24 -
25 -Both goals are easily met by creating child threads with [`maidsafe_utilities::thread::named()`](http://docs.maidsafe.net/maidsafe_utilities/master/maidsafe_utilities/thread/fn.named.html):
26 -
27 -* it returns a [`Joiner`](http://docs.maidsafe.net/maidsafe_utilities/master/maidsafe_utilities/thread/struct.Joiner.html), which helps to avoid detached threads
28 -* it requires that the child thread is given a name
29 -
30 -## Rustfmt
31 -
32 -Apply the latest `rustfmt` to new code before committing, using the default configuration or, if present, the repository's `rustfmt.toml` file.
33 -
34 -## Function ordering
35 -
36 -In `impl`s, always put public functions before private ones.
37 -
38 -## Clippy
39 -
40 -If a crate has that feature, make sure your code does not produce any new errors when compiling with `--features=clippy`. If you don't agree with a [Clippy lint](https://github.com/Manishearth/rust-clippy#lints), discuss it with the team before explicitly adding an `#[allow(lint)]` attribute.
41 -
42 -For Clippy, we currently use version 0.0.120 with the nightly toolchain installed by `rustup install nightly-2017-03-16`:
43 -```
44 -rustc --version
45 -rustc 1.17.0-nightly (0aeb9c129 2017-03-15)
46 -```
47 -
48 -**Note for Windows users:** Due to a recent bug in rustup, you may get a missing DLL error when trying to run `cargo clippy`. In this case, you can work around the issue by modifying your `PATH` environment variable:
49 -
50 -```
51 -setx PATH "%USERPROFILE%\.multirust\toolchains\nightly-2017-03-16-x86_64-pc-windows-gnu\bin;%PATH%"
52 -```
53 -
54 -## Cargo
55 -
56 -Use `cargo-edit` to update dependencies, or keep the `Cargo.toml` in the formatting that `cargo-edit` uses.
57 -
58 -## Other crates
59 -
60 -Adding new dependencies to MaidSafe crates should in general be discussed in the team first, unless other MaidSafe crates already have the same dependency. E.g. [quick-error](https://crates.io/crates/quick-error) and [unwrap](https://crates.io/crates/unwrap) are fine to use.
61 -
62 -## Git Commit Messages
63 -
64 -The first line of the commit message should have the format `<type>/<scope>: <subject>`. For details see the [Leaf project's guidelines](https://github.com/autumnai/leaf/blob/master/CONTRIBUTING.md#git-commit-guidelines).
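The unwrap and thread guidelines above can be sketched with the standard library alone. This is a minimal illustration, not the guide's exact API: `maidsafe_utilities::thread::named()` wraps the same `std::thread::Builder` pattern shown here and additionally returns a `Joiner`; the function and thread names below are made up for the example.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn a named child thread that increments a shared counter, then join it
// so the thread is never detached. maidsafe_utilities::thread::named() wraps
// this std::thread::Builder pattern and returns a Joiner instead of a handle.
fn increment_in_named_thread(counter: &Arc<Mutex<u32>>) {
    let shared = Arc::clone(counter);
    let handle = thread::Builder::new()
        .name("counter-worker".to_string()) // meaningful thread name
        .spawn(move || {
            // Locking a mutex is one of the accepted unwrap cases.
            *shared.lock().unwrap() += 1;
        })
        .expect("failed to spawn thread"); // spawning: another accepted case
    handle.join().expect("worker panicked"); // joining: another accepted case
}

fn main() {
    let counter = Arc::new(Mutex::new(0u32));
    increment_in_named_thread(&counter);
    assert_eq!(*counter.lock().unwrap(), 1);
}
```

Naming the thread means panics and debugger output identify the worker, and holding the join handle (or a `Joiner`) keeps its lifetime explicit.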
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Documentation/Update Snapshot Used by Droplet Deployer.md version [d8b1fae71f].
1 -## Update Snapshot Used by Droplet Deployer
2 -
3 -1. Create a new droplet from the existing "Droplet Deployer" [snapshot][0].
4 -1. Make whatever changes are required (user is `qa`, password is held in the [release_config repo][1]).
5 -1. On the droplet, run `sudo rm -rf /root/.ssh/ && sudo shutdown -h now`.
6 -1. Once the droplet has shut down, take a new snapshot called `Droplet Deployer`.
7 -1. Replicate [the snapshot][0] to all regions (click the "More" option, then "Add to Region").
8 -1. Rename the [old snapshot][0] to `Old Droplet Deployer` (check the "Created" values).
9 -1. [Generate a new Personal Access Token][2].
10 -1. To get the ID of the newly-created snapshot, run `curl -sX GET -H "Content-Type: application/json" -H "Authorization: Bearer <token here>" "https://api.digitalocean.com/v2/images?private=true" | sed -n 's/.*"id":\([^,]*\),"name":"Droplet Deployer".*/\n\1\n\n/p'`
11 -1. If this doesn't yield an ID, it may be due to pagination of the response; you may need to add `&page=2` (or whatever value the last page has) to the end of the URL after `private=true`. Alternatively, check that the [new snapshot][0] has finished being created.
12 -1. Replace the existing value of `"imageId"` in [Droplet Deployer's config.json file][3] with the new one.
13 -1. Test the [Droplet Deployer][4] tool.
14 -1. Commit and push the change.
15 -1. [Delete the Personal Access Token][5].
16 -1. [Delete the `Old Droplet Deployer` snapshot][0].
17 -1. [Delete the freshly shut-down droplet][6] used to create the new snapshot.
18 -
19 -
20 -[0]: https://cloud.digitalocean.com/images/snapshots
21 -[1]: https://github.com/maidsafe/release_config/blob/master/droplets/credentials.json#L3
22 -[2]: https://cloud.digitalocean.com/settings/api/tokens/new
23 -[3]: https://github.com/maidsafe/QA/blob/master/droplet_deployer/config.json#L37
24 -[4]: https://github.com/maidsafe/QA/tree/master/droplet_deployer
25 -[5]: https://cloud.digitalocean.com/settings/api/tokens
26 -[6]: https://cloud.digitalocean.com/droplets
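The `sed` expression in the snapshot-ID step pulls out the `"id"` value that immediately precedes `"name":"Droplet Deployer"` in the API response. The same extraction can be sketched in Rust; the response literal below is an abbreviated, made-up example of the DigitalOcean `/v2/images` payload, assumed to inline each image as `...,"id":<number>,"name":"<name>",...`.

```rust
// Find the image ID that immediately precedes "name":"<name>" in an
// /v2/images response body, mirroring the sed one-liner in the steps above.
fn snapshot_id(response: &str, name: &str) -> Option<u64> {
    let marker = format!("\"name\":\"{}\"", name);
    // Everything before the matching name; the id we want is the last
    // "id": field appearing in this prefix.
    let before_name = &response[..response.find(&marker)?];
    let id_start = before_name.rfind("\"id\":")? + "\"id\":".len();
    before_name[id_start..].trim_end_matches(',').trim().parse().ok()
}

fn main() {
    // Abbreviated, hypothetical response body for illustration only.
    let body = r#"{"images":[{"id":123,"name":"Old Droplet Deployer"},{"id":456,"name":"Droplet Deployer"}]}"#;
    assert_eq!(snapshot_id(body, "Droplet Deployer"), Some(456));
    assert_eq!(snapshot_id(body, "No Such Snapshot"), None);
}
```

As with the `sed` version, this assumes `id` directly precedes `name` in each image object; a real tool should use a JSON parser and follow the API's pagination links instead.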
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Adam.pub version [92584adc12].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDYrVCkGerTtN6QDfZK30PMORBO7Up6Cbg3fikqIaGlLFN+osMn6NjZvfKBXb2JOnlPGRtuzb8KUYl14gtHo/eQ9BT5ASKbKp+LUw6eEmfcaZdd7H3x9GfsbH3+EG9ALm/NPqUBDXNshRq563yfPJMkz4Rk/hcTVURl0E3IPcLHE5ymjCz8Ar8NMdvmWAD7ft/QqoRRG4Bnx3Tc6uSi5s35jHdj66zQlLpoDpZ+IW3z7mk03nE7B8in1quHfNKwRYNIb0vBoV5nKSFwquGpYfB+M0/g1R9a8JRrLeMGv+XkGVGt6Ltja76fxYygZZDP99XrFqw89bEL4mOzrDCGTwDZ adam@higgsboson
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Andreas.pub version [c2c9b6e8ef].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDUwyrjDbQQhVzXk2mdMLm80/+2eHW1L4fw/flmmee+4FI3WF7b8L1bMjl7TApeMNU+HXc3KxBupkni5LjuXLZOS2L/Zo6yIcrudQpyAb8275phueT3KS36Q4oNLEv+E3IXQiyfNeE8hsvqoFdoo+V9FyR9SFPlDndfUsTC4O/nANWv+jO+1K6Iyd4b5OhZUP+Iw563OtSXFwFGxpgEhz3dUOqL6C0i5M2hxnqdx0FesBowE6uu4Npsjf1KUE/aNcM/+9+loD1PCnQja634V5m6jKy2y121h7n5S/y0gbusoml9Kfe8z30CMwyP4SkHwtBIPG1bf38N08/LUfbr83p7CpIz6wOCcDdY8mx2SsfCoyb0eJcCP7czlqHe70i6F9o77SWWdRX/m25x9bcKug6MUYgVNB5BXbN3nj0RxmitNQ7MpPcs6YD0WxtY8KDh1XZ2a73bie+h/bjN2FqT92AnC9mmZ82YP/v/4l0GI3854dxB5uGGG3m9j1TqYg1I/GVpuqiF7lGRvnR7ip+ahpOVnmaV/pUOQPZGuYps/0hSo5UIo8G1o89nk4eICScwU9h6cSx+MrUjVciPssIadiL4SZ8KU55arkyzXb6zZRhm3MKKBTmB3FIU6/9MW/2N2LcoTurwcC8+wELvGTfYXOIxUVjvpIZ21ZfR7F1n61XRAQ== AndreasFackler@gmx.de
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Bart.pub version [c76b5d6893].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCnf8sXey8Q5tBPKEAkJAErTIYgluP/NnMpqG86dcSWDbJOXay5PKQh5iXwRLCJ+ZJLvft2a/QGMVXain/yF9wKugUPosFg7dqgQKyFQk0Y3nKK/I4OGyKd3XJtOBVckYow/wEPDLkAWThf2VimDudUbsJ6VPDbAlWBg8NTiDJRaPzohpkru7c/y+yyuFVxmRi4m+1YzM00R12HJr5jqf/qNOZI/pUccNEhMnchFlU7t++Pk0ZhwOgvLEeGfLGfI622HdNVToVNJ7VVxVMr+qyvqBXiIVfIdRVGvoBeoIboTpUxEcYvkgPouxQxkJOSrbxOF/b+3nQ6bff9UTUDL9zf bartek@bartek
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/David.pub version [d28b9b3b21].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAynpCS4y8Yvvd8w1ZeDlXTjjdgTxTsStqNl9lDWxjlwd8dyIWSOfSyJowB0gz1PAS7/gyuz1+RfOP6n3NmJCg1l1TQI6CXt/0HFTp5ucdL5bvfmUM786rOH4jKxQUbw8Mk6p9upVNaEF6R/WyQP2UwPyQgV+wNBIdheR7ytu5YXXmvaE1bCZ3gXbWvhY0PKQYgpX6dVkTJTYvRPFnffw3M99gIFOkk2lvDhuh/GQeeMC+LMml+NskQfiw+oBxKU4ws756HKr0ZlwyrBfH0SmTW+YxXZl5gsnxz32g2wSc7N/jjnJGZ9CAY/7UrARNfXVg7SByNAf38qqwl6TiFtkjyw== dirvine@dirvine-desktop
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/DiggoryHome.pub version [023e024fb8].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDXPPVjQWY1A66cXlZIi7xrmwMa8TPIeYKMX9xWL5rW4IN1FJ0c6smL7qDRKs6eZP5XWYWV+aaeMoV5vBtCEEnA63xnEgQYgiY8UjLxWTY/0owpJWYg6WJNp26b8eKb/5Kwbpy88ETi52mSDTjJY+djfz30SPBOmHRV34Cmpi1paxWxSEzsxblCEU1Hv9WnE/fjt0E1VCKMKS6YGBEFuTRAnfOKIu7wlrbHkB5NaqGTqaj6ChO73TQe77qFnxQOp9Ph2jERaWFwvIZdFH0cD7+WpgmOaSjdzEYUESicqanZSgY2nN23zgMt16rigkuSoUWKhQavHpUFar17tAuQ7HQr dhardy@TPH-L13071
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/DiggoryLaptop.pub version [73f3ce665d].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDN0PnwrwauBsFhuDz1I86Dq/yyteYU+qDYdnYcwyknbx8RrDJ9zzf2rPvFgyQwPPE/HZxXO2jp2nRrUnobucC8nFPFU+owf0mgKkWyT+UD1iVvqT3QHvpKgVzcsM4mSKYoQSf0OymPUNbYRRy01BHdNLXrqHFnC6YshPejuLpijiFsKe0OSQIkjcUffx+Xe/iTFmXHSaZTb23wjTwInBNA7ZofTZCJ94uQRxGXqW0hGqeCr6lw5rL18iomX8IhCFSPZnBzVBET9ll4QLVpadeq35noXy+ArgmCoyS60cPnbX/ZpMDleNgV8ClSzjoE0+N7FPb/7OL3L7ZRCgTqO9Pt dhardy@yoga.dhardy
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/DiggoryWork.pub version [7217746439].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDDbHl0Nu0wXbj0WFVACmG12uQcNduZsgMpLxL0ANoKMvue4VWhhW/nhIK1AIyW+iSvgf1DVQGduWkoeh7SGWN/eHzAqJ2/o4UFbmsl8mL0bcvSakz9xrwhhZQpaK/Vy2N8319cF3uUwujg3SA9S4Q7tu0UKVYA9YF2AN070z5jnJyqK2VVROoWHM48cm/zwHZJBWsqRya7GxpvG70NsyzR+Ap8oe7NKXynZr8bxnQ3JPJr7PsWnnQiiTlzWhjSInoLU1+5xxvnZe0xPhB8K1BBzoOvJDqeI9IrDVGFcxu5PduIyEP9G43swjU/dMuY7Y87WKzHUCU5EMYx4/R5R/I1 dhardy@localhost.localdomain
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/FraserHomeWindows.pub version [4e917b90c0].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDUwgBg3rxrbI/SBm25Q1tIXV9iym7xBIspRtyY3yhxaPBcggVNU63cwfbjXFUCIHAkA3ZZBHAn+4P6uXYqyz0c7ticl9LOfDQm/mPCyZw3gOrGtcI6/xV5dwvYJpOd8pBFS5jIUXto3EG0YOmSqvxIPllHhzd+6IeK/5QKJPNqaEKYXWtgA55iBUq0JqNOWfJx/whJPzOJVdeWeHQjMg++DBrbBFpbLSh3S3qAda88jKBNL9LtOfXK/VJsdJ7/yW1xYeSA3Zu770y60fvzHOUUTpPuvMKqamHKubU54A8/aSzpaHpNIHuFdAfmwKYT3DfeFIR8644+6GTVVd5jVvF7TBg5+lDABcRqruSx6kc4rFxMWzkcHWZA9dXW2B4KP1WrRzSUmXOMWXcbgdZeCMR9QVP3K/AZdBwhXp8LEJXhOlcsEXplGEcp3FrR6SKtut/dOpLur8z/SOTctgmctHrNKJ145Mmu8ws5b1UNRBmVY+CMNvXHw2pXgz1LACaKx3R2dhTouZiGX19eN6V/Qaa+06hizX6ybsBh/zukdTkHtbLzzaMO46RZISFRFZ+zZzLQtenBTSFlR+8V9e5VhfVy8CxQKupLMeeADKoqrGUEGtouYZ1XoAmAAbX2ctO3sSPqeSYusI3F6tVZ38UpcOjwlWUattLXAL8miF7Pbzixdw== Fraser-Home-Windows
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/FraserOfficeLinux.pub version [b5a3047fff].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDeTwVlMND8ubS3U7OmMhxXg0x+pLIr47JV2+Ks86QKR4uGxby2/CcH18lPydwPSmvM2vWuSL1WvHTItX5jmq2BA6guEMY4GBgs3l/2nAR+rN1A5JKgI1T2HgAOL2tRYrAboNIna0KAl1lMMJPsNv26b9PK6w6NhFl/U8qG8iJkv9FbZClvw34UrDw0qpydrGfS/2xikTSXqcjlofZvzUiK0kaD3R5yDqPc3Sz64UhiLKos/gSKQHNbeNc9W/C1Em/DDM8WneVRfmYMbPru2/6DG1F6z4QIaFm/AeyYlRN5PWtVdH6ycg+WB85ZJyjQvN9JtUGdBJ3rvGHALpm5fCxAwmsR6PGI8r4xJVKOMGf3jYkDLdfNgKgKCuQKV4JL7QMMQxCz5HeoMrBjXbQfoTjkQ3Py2C2iz17Aol6BSyYAdZuD2dIEwV0ds81iRfYVTCw+Hd17iUkWoIS2R74EOYfjMkbdkaMz7Dpoqgn6p5FjSrvwHmkQ+b7zXTlWgmAURYMe67gt8ndm16m+/qyFTy0O6AXK2bo2lpxfq68f4bkWQWY7md7YWE7JRaMH+pu/VFfD/mSeNBN8cWljzlC3iSfT6vBnbLxoPsFdX7GZceks9AQvZMgvpWKjMeJmWmdDVhULBSJH1LjLA1/ddmFRoT036FL3he7+b7GYwZ+mR5RTrw== Fraser-Office-Linux
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Krishna.pub version [1590b2d723].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDXT3Mawk2S1O1sMLSkr/k2E1LERCc9AxSM3SK+asHaNTHPceBFiOyOn+QwePWO8o87lIww4/cNv+lY3aglFfGRyRaJ6mcL0H8Ccoz2AwBdUEbJSY75CJWGZTBFmeL4q3sYU71mdUBYDZwYxWSUSmEmxfATxZG9MZKlvxElCQQXDSDorj/TPPMYaWzhwSl1jhC2wxTrxcU/e9sSm67hBi8hNFxdNlooNhAWYl/pq39/uzRyWrH+lCfq17yuil+1cVQVDs5MF8/caK+jO6mTeHgkO+q+NdEObtijkhQEOZc0+eH0t7/RPdDvUSXe6W9JMYgjFDK4DKn0lFBHPcupjiWSVCVBpbUKbBUHPh75GIN8CYmO/w5VGWgjP5SBQrGtMiPHcFNELDSvcEp5gBQAmjKTbCycD1O6NffejhAcvRMKHMU08EUqHg9phzMbkuh4HUtrTBmf6xYyWLKCzgZSwddt5zRHuPNbca2kH6AFVaCVeuCvNeGlirti6JEVlcxYG1oD2kM0tFKa4UsDuNHmJbEUJW28S5diurXJVpo+iIJ2rLfoCGWnfXYzTyAPXT1t/Wjo3AmJHWym16XGNHmwnjrVXqmLumc+VwOS3xc7nR/utQiH1UZzKlBgUYfXN6pkdq2JNj3awFMLlHSYmAxUlNR7YrpZwZL4nEuRekJOxQFcxQ== krishnaof1988@gmail.com
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Michael.pub version [9a49a43de4].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCoMq7SaKOpcan3UlMqP9CYSrF1IkNekjQ3BKVV7fF0LTN5lxQ9rOi3knEEuFCvH2VMbYEKFGAieJa3OuZHlBQMfv66LChZCeAiBaG68iRww84DXBiGWDTuUOBmwepOhXfKIO4I2Qr/st3sPWbD4ddprHd7TJdFqpnTmGwG61m4wf0m3jWZygfqxA85UlweUjWsP6DerMVrfG7F+kNYGdpFcDR0CjPKC2cHwGyIhmBI9jhLHfR1k03+qLKLAcPIIjh8+iAep4FELpnPkrC222DmAL7X9KDuYeh+V2GWc/jcaERFzk3xUx59L4Q6YGnLcO2EoRlGiBOITdrut9DBCIjCcyd/MCkHovL+zdmWCqxYT4ITFsOW91a5UlAAStQLRtCkHbprmIaNEsu6mWAW6owTAIAj0u5f5wyBOEkb7BSifPpbg0jN1EqbKnx+YuXN5MvrKmRQzARpJCIGyhJBpvP7Uh+IJHtULoJNbd5XzWN0F6Z+szlIsPUt31NbPLIeLzqqHuW+rmf1Cl/wcEX8BzOnP3PtTH6TfxfwcwP3v4n2HchPdzY9ZJRd+E5zuEAW4hJL3iWtTM5ARWZC2RSk1wCXggbUkhUQxpPS4GpTzmaBiHNirNZUJU0SDnHcsYuEsQditSqrh01ss9Y8HQRYJ0n2Qh/soV4sUCoe5dyGp3SfHw== michael@michael-macbook.local
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/QiLinux.pub version [6de8d8a8d9].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCm34L9WS2OpYeJ8caQ3Z6dNrjIB1AZOCzejPJW33CeYMSN153l0p9pAlLNJPgOET/JcSHv07gOsdTzAqWZvEcuaLhCeX3X/WBXg1ZaqPvxXLsJLW4EtXDuENaQQ18oRpKFBuHULjkI4wopw34JMHWh6WIQrPVOLDcXsX9cfknviCGdlBScHahxB5ZZ9w5wKxdRDFqJEkit8rQlJR5grVrUq9SYb9zWUUBE0/YeULD6wIhrm5bDepfuTuELdhXF1nzUNQb6Kis5lsi9N1jeG5jMDWsP0cLYvUg1zkB4COiiI95ZT7Rwggbvj2/qrHG3P4LhJlXjaZTzyxjxZojMG+Tfjd0su3J+cnMGhkwj++f4CeFVo7Vbox6U5WT8E+UCXVqRcgvCOePdO76EI17bkHshhDef2RDGvBCYrkSy4f6iqCoKXRnPav2buEI+/pQgacfdxz3CeBrhuL1mXETO4BWf/YvDZYiX6L2+NgVcAVJEDXFrDNsMR2zRkqAKL3ysBOhGKJY20MxL6DuWMZv3byT3f8W3wnLDOQgN+k4HNtg/q5hi4a5KwoLPMTat/dD9lAgRpUhcdxh1AhkTmWxc12CrCpbVCc8kyzu4gdZLPE7ZGKP5YtbbHMJw5p2TzLMs9w8ZkB/WycTqZsqyYcHDPM5UDIh18/ncbITEhDyZIX6iHQ== qi.ma@maidsafe.net
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/QiWindows.pub version [4a751e88d0].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDX2xmkt7sgPmh13dG51YC2QMrznFWEHeMCqzUfP96mSh1dRPZB6nOFhEvMvHmJhqy6oyxWYChttPtSzWZf3o68+ur/YkRbFONV3Kn8sP9qfQHDGa7scT9n5EDxTLzGm1yN4RlQDD2bdhVkYmdkfLcdsEYntOi4Zj45N+xMziH1NQou02iwHuJTIHOscCxWyuTbKFYydNw1NWbCOX8AA0lZoqtrYTsZMceQ/AkLkG1N/dCZtQxMbfSBuRM9cbLsDK58n9PI+1c6OflIba2pb8lHiq7ThrZY8CcZolvFYRWlVYMfPysjKiiCQzegNQkGvKrb7r89swr6QAd/wGldqGab qi.ma@maidsafe.net
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Ross.pub version [78b91b0f6e].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCl0HFTx1cmWG94LxtxYyr0Go4K0kJI3Kd9DU97qPuqrbRnb3Sym8+5C4Xqe4QflqRhluJiWtyZ+XzIiEb0uNGvS2peP7Gb4sdRdfGKFuYg8vfQumv/JhRRn1tw45dOQNDGTAUKFcZmBdpTG8R990LN8991ORSA4jSCzJ3KPbIErhHFI2IknNyURUcopeIu1B3HOwu5WFdC3gWo6XzzgKsenKCQJdlZ1SRSJrHY5L4a4eGTDnkuguE78jx+DpIOJ5UJC1NxfwKOhSG1O34GsBur1lonae5Fx1HwyMRgTmTYGUDNyCo+gqV65y5352wQZrQFc++0YU8cJi3496PQUgWR user@QA-ROSS
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Spandan.pub version [6966862f9b].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCpG3EOW3aKzk0NHR6dsP4ORd/gUpttGwECd7IRx4mxUkDYM3cROqy0kbT4IJzUri44EGiKDk5EUhuoNUhA1yW4SgqecE+1AbFfBxUHmzJemqSkctjIxZSuYA+R4c3kbeMLAlk+nEcxxZqTBzyPhNQVqhtLlWYqYVVp41y4HSybInHn4q7vkoUsyAqp+taQX5tafEI2VmokMFdUbVsJDUSGxrzIlj5hPxL4kXzMxMcPMCeuxIKBOJsb/+KjrlsHMrfSrMIdM677Qx4ycoCt1hMpndVXECvBPFT7y/CpXdF3xMT5+hFsdrwYsu4uG8ggi+NZUqFjgcW7FJVDAx0CYr9L ustulation@gmail.com
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Vinicius.pub version [f6ff062577].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDMVwr0GWuka5X+GDUyamKe8zFvUVblNce4/p1j/O9bFnHs9d8N+OkYkz6CkBsXsfJpb0+YYwpYdl55/Lg4ohP5mJjWnMDB0pacooSYLwpJSZnlV0+aJgu4gMMRfpP4amYnBVm80iPkZZ42OC/ZVNW5Hd0yTuAFUtdnwDKgV57Rk6rhT5pGWSPYrchIGWJCQzMHAkNMmmA5xPdRzAKo7tTy3mGqdWJfiyqM1J1NSDi7UgQCm8ehu2rN2/Gs+I2E3N08MQnJUOAcMrxe1X3lgA6kXEnYEWurEq5ZhC3sOXw8erOWmNtXvqI2O6C/rXBQgzlVliNxtubl8yWnmNPX8UXF vinipsmaker@vinipsmaker-netbook
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/Viv.pub version [48138a48c3].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQC/s9I/3XYObaipr54raYtt/f1StRcuybWWCfB03rv0yG1duNrIrpSP8dv1uscmt/OXMvUSLdaGURTZZ8XytD6SFwisVSkTQD4tqqk1NmTQt8DEUZ9KErbQCiyEAAcI02QMJ5IeqvismdyvRnfcNV9Nx2vaaCftppJ2R9rTtm9hNOsa4eoLCnuUACvef6jiAa0Fzn5GV7y91dcrVuWiKnUIdBRtxwv1sJRPT6epm6l6AZcpyA+2Qc4kgS2ak4tAjmSlAWUAIoyYECSweCiIwKJL7WLNSNVV3omhljNLONrckOlfglg7LqUrLYMNh2gHPAdUTCPHFuMlIW4rWvSZi9E0JNTZ7o7+x4PWu+SI8a0faXQ1i8S5qSBhNl3HUbChPH7VxktHrZ4rohOpd4WbV75PrzOoycJwplyuyLzLluWOtE/P+a/EmDV/2iUrlYujQQKHaXhbVIaffI8fct+BuPQAN+EmmMIx/h8BSoeWIBMK/ZdxCcDAuCXeoqonYp3QCFef2+dL8CM5EAjGKkxKHPUcFagf/RsM1VMgb0k3Q30jXqc45k8e5XxsI1cXegRrj6z6ZZmLjPOZrdNxclNDz4xigzZwqf6s9uG+0RxgqCvZZoIJpkfGtGviN6Pm1o8/PPGHI3bmrOv8r/ktjy+V2xjKae6Q5Sw/h83gd1csFoosCQ== viv.rajkumar@maidsafe.net
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/QA/Public Keys/authorized_keys version [6d38c40df4].
1 -ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAynpCS4y8Yvvd8w1ZeDlXTjjdgTxTsStqNl9lDWxjlwd8dyIWSOfSyJowB0gz1PAS7/gyuz1+RfOP6n3NmJCg1l1TQI6CXt/0HFTp5ucdL5bvfmUM786rOH4jKxQUbw8Mk6p9upVNaEF6R/WyQP2UwPyQgV+wNBIdheR7ytu5YXXmvaE1bCZ3gXbWvhY0PKQYgpX6dVkTJTYvRPFnffw3M99gIFOkk2lvDhuh/GQeeMC+LMml+NskQfiw+oBxKU4ws756HKr0ZlwyrBfH0SmTW+YxXZl5gsnxz32g2wSc7N/jjnJGZ9CAY/7UrARNfXVg7SByNAf38qqwl6TiFtkjyw== dirvine@dirvine-desktop
2 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDUwgBg3rxrbI/SBm25Q1tIXV9iym7xBIspRtyY3yhxaPBcggVNU63cwfbjXFUCIHAkA3ZZBHAn+4P6uXYqyz0c7ticl9LOfDQm/mPCyZw3gOrGtcI6/xV5dwvYJpOd8pBFS5jIUXto3EG0YOmSqvxIPllHhzd+6IeK/5QKJPNqaEKYXWtgA55iBUq0JqNOWfJx/whJPzOJVdeWeHQjMg++DBrbBFpbLSh3S3qAda88jKBNL9LtOfXK/VJsdJ7/yW1xYeSA3Zu770y60fvzHOUUTpPuvMKqamHKubU54A8/aSzpaHpNIHuFdAfmwKYT3DfeFIR8644+6GTVVd5jVvF7TBg5+lDABcRqruSx6kc4rFxMWzkcHWZA9dXW2B4KP1WrRzSUmXOMWXcbgdZeCMR9QVP3K/AZdBwhXp8LEJXhOlcsEXplGEcp3FrR6SKtut/dOpLur8z/SOTctgmctHrNKJ145Mmu8ws5b1UNRBmVY+CMNvXHw2pXgz1LACaKx3R2dhTouZiGX19eN6V/Qaa+06hizX6ybsBh/zukdTkHtbLzzaMO46RZISFRFZ+zZzLQtenBTSFlR+8V9e5VhfVy8CxQKupLMeeADKoqrGUEGtouYZ1XoAmAAbX2ctO3sSPqeSYusI3F6tVZ38UpcOjwlWUattLXAL8miF7Pbzixdw== Fraser-Home-Windows
3 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDeTwVlMND8ubS3U7OmMhxXg0x+pLIr47JV2+Ks86QKR4uGxby2/CcH18lPydwPSmvM2vWuSL1WvHTItX5jmq2BA6guEMY4GBgs3l/2nAR+rN1A5JKgI1T2HgAOL2tRYrAboNIna0KAl1lMMJPsNv26b9PK6w6NhFl/U8qG8iJkv9FbZClvw34UrDw0qpydrGfS/2xikTSXqcjlofZvzUiK0kaD3R5yDqPc3Sz64UhiLKos/gSKQHNbeNc9W/C1Em/DDM8WneVRfmYMbPru2/6DG1F6z4QIaFm/AeyYlRN5PWtVdH6ycg+WB85ZJyjQvN9JtUGdBJ3rvGHALpm5fCxAwmsR6PGI8r4xJVKOMGf3jYkDLdfNgKgKCuQKV4JL7QMMQxCz5HeoMrBjXbQfoTjkQ3Py2C2iz17Aol6BSyYAdZuD2dIEwV0ds81iRfYVTCw+Hd17iUkWoIS2R74EOYfjMkbdkaMz7Dpoqgn6p5FjSrvwHmkQ+b7zXTlWgmAURYMe67gt8ndm16m+/qyFTy0O6AXK2bo2lpxfq68f4bkWQWY7md7YWE7JRaMH+pu/VFfD/mSeNBN8cWljzlC3iSfT6vBnbLxoPsFdX7GZceks9AQvZMgvpWKjMeJmWmdDVhULBSJH1LjLA1/ddmFRoT036FL3he7+b7GYwZ+mR5RTrw== Fraser-Office-Linux
4 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDXT3Mawk2S1O1sMLSkr/k2E1LERCc9AxSM3SK+asHaNTHPceBFiOyOn+QwePWO8o87lIww4/cNv+lY3aglFfGRyRaJ6mcL0H8Ccoz2AwBdUEbJSY75CJWGZTBFmeL4q3sYU71mdUBYDZwYxWSUSmEmxfATxZG9MZKlvxElCQQXDSDorj/TPPMYaWzhwSl1jhC2wxTrxcU/e9sSm67hBi8hNFxdNlooNhAWYl/pq39/uzRyWrH+lCfq17yuil+1cVQVDs5MF8/caK+jO6mTeHgkO+q+NdEObtijkhQEOZc0+eH0t7/RPdDvUSXe6W9JMYgjFDK4DKn0lFBHPcupjiWSVCVBpbUKbBUHPh75GIN8CYmO/w5VGWgjP5SBQrGtMiPHcFNELDSvcEp5gBQAmjKTbCycD1O6NffejhAcvRMKHMU08EUqHg9phzMbkuh4HUtrTBmf6xYyWLKCzgZSwddt5zRHuPNbca2kH6AFVaCVeuCvNeGlirti6JEVlcxYG1oD2kM0tFKa4UsDuNHmJbEUJW28S5diurXJVpo+iIJ2rLfoCGWnfXYzTyAPXT1t/Wjo3AmJHWym16XGNHmwnjrVXqmLumc+VwOS3xc7nR/utQiH1UZzKlBgUYfXN6pkdq2JNj3awFMLlHSYmAxUlNR7YrpZwZL4nEuRekJOxQFcxQ== krishnaof1988@gmail.com
5 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCwTp0j1PVpCCi8L6OV0VzCl8tP8eyRBM/eBuud+uEjna6HtpEsvtnTzQmp0Tqx62ktGFKEYqKL/F9m0gNgP1nBC6LqExNXkR7+YVXRNgAoF1J8JF+zdIBOyTaGcFqB1R8/1iL7Aybl8u+eS0wM2I++kgAi5npRQDmNgA/b5AotoSsSwgIatmq6c4PY0wiNr9NF9C58VFHiw+p4IIFO1Jfnx3pkSjaL/DmXvawwbeOit/ik4V7ESvM5Ioao2F1Gydim8DEIKfH/r8FHpaE4TlwuIuveP/Fcz9iS5K/pqVNEQlvwLAyrYrjwOc01JRKQE1q1oF6aaryd2UjzbqtKN2Xt qi.ma@maidsafe.net
6 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCl0HFTx1cmWG94LxtxYyr0Go4K0kJI3Kd9DU97qPuqrbRnb3Sym8+5C4Xqe4QflqRhluJiWtyZ+XzIiEb0uNGvS2peP7Gb4sdRdfGKFuYg8vfQumv/JhRRn1tw45dOQNDGTAUKFcZmBdpTG8R990LN8991ORSA4jSCzJ3KPbIErhHFI2IknNyURUcopeIu1B3HOwu5WFdC3gWo6XzzgKsenKCQJdlZ1SRSJrHY5L4a4eGTDnkuguE78jx+DpIOJ5UJC1NxfwKOhSG1O34GsBur1lonae5Fx1HwyMRgTmTYGUDNyCo+gqV65y5352wQZrQFc++0YU8cJi3496PQUgWR user@QA-ROSS
7 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCpG3EOW3aKzk0NHR6dsP4ORd/gUpttGwECd7IRx4mxUkDYM3cROqy0kbT4IJzUri44EGiKDk5EUhuoNUhA1yW4SgqecE+1AbFfBxUHmzJemqSkctjIxZSuYA+R4c3kbeMLAlk+nEcxxZqTBzyPhNQVqhtLlWYqYVVp41y4HSybInHn4q7vkoUsyAqp+taQX5tafEI2VmokMFdUbVsJDUSGxrzIlj5hPxL4kXzMxMcPMCeuxIKBOJsb/+KjrlsHMrfSrMIdM677Qx4ycoCt1hMpndVXECvBPFT7y/CpXdF3xMT5+hFsdrwYsu4uG8ggi+NZUqFjgcW7FJVDAx0CYr9L ustulation@gmail.com
8 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDMVwr0GWuka5X+GDUyamKe8zFvUVblNce4/p1j/O9bFnHs9d8N+OkYkz6CkBsXsfJpb0+YYwpYdl55/Lg4ohP5mJjWnMDB0pacooSYLwpJSZnlV0+aJgu4gMMRfpP4amYnBVm80iPkZZ42OC/ZVNW5Hd0yTuAFUtdnwDKgV57Rk6rhT5pGWSPYrchIGWJCQzMHAkNMmmA5xPdRzAKo7tTy3mGqdWJfiyqM1J1NSDi7UgQCm8ehu2rN2/Gs+I2E3N08MQnJUOAcMrxe1X3lgA6kXEnYEWurEq5ZhC3sOXw8erOWmNtXvqI2O6C/rXBQgzlVliNxtubl8yWnmNPX8UXF vinipsmaker@vinipsmaker-netbook
9 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQC/s9I/3XYObaipr54raYtt/f1StRcuybWWCfB03rv0yG1duNrIrpSP8dv1uscmt/OXMvUSLdaGURTZZ8XytD6SFwisVSkTQD4tqqk1NmTQt8DEUZ9KErbQCiyEAAcI02QMJ5IeqvismdyvRnfcNV9Nx2vaaCftppJ2R9rTtm9hNOsa4eoLCnuUACvef6jiAa0Fzn5GV7y91dcrVuWiKnUIdBRtxwv1sJRPT6epm6l6AZcpyA+2Qc4kgS2ak4tAjmSlAWUAIoyYECSweCiIwKJL7WLNSNVV3omhljNLONrckOlfglg7LqUrLYMNh2gHPAdUTCPHFuMlIW4rWvSZi9E0JNTZ7o7+x4PWu+SI8a0faXQ1i8S5qSBhNl3HUbChPH7VxktHrZ4rohOpd4WbV75PrzOoycJwplyuyLzLluWOtE/P+a/EmDV/2iUrlYujQQKHaXhbVIaffI8fct+BuPQAN+EmmMIx/h8BSoeWIBMK/ZdxCcDAuCXeoqonYp3QCFef2+dL8CM5EAjGKkxKHPUcFagf/RsM1VMgb0k3Q30jXqc45k8e5XxsI1cXegRrj6z6ZZmLjPOZrdNxclNDz4xigzZwqf6s9uG+0RxgqCvZZoIJpkfGtGviN6Pm1o8/PPGHI3bmrOv8r/ktjy+V2xjKae6Q5Sw/h83gd1csFoosCQ== viv.rajkumar@maidsafe.net
10 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDUwyrjDbQQhVzXk2mdMLm80/+2eHW1L4fw/flmmee+4FI3WF7b8L1bMjl7TApeMNU+HXc3KxBupkni5LjuXLZOS2L/Zo6yIcrudQpyAb8275phueT3KS36Q4oNLEv+E3IXQiyfNeE8hsvqoFdoo+V9FyR9SFPlDndfUsTC4O/nANWv+jO+1K6Iyd4b5OhZUP+Iw563OtSXFwFGxpgEhz3dUOqL6C0i5M2hxnqdx0FesBowE6uu4Npsjf1KUE/aNcM/+9+loD1PCnQja634V5m6jKy2y121h7n5S/y0gbusoml9Kfe8z30CMwyP4SkHwtBIPG1bf38N08/LUfbr83p7CpIz6wOCcDdY8mx2SsfCoyb0eJcCP7czlqHe70i6F9o77SWWdRX/m25x9bcKug6MUYgVNB5BXbN3nj0RxmitNQ7MpPcs6YD0WxtY8KDh1XZ2a73bie+h/bjN2FqT92AnC9mmZ82YP/v/4l0GI3854dxB5uGGG3m9j1TqYg1I/GVpuqiF7lGRvnR7ip+ahpOVnmaV/pUOQPZGuYps/0hSo5UIo8G1o89nk4eICScwU9h6cSx+MrUjVciPssIadiL4SZ8KU55arkyzXb6zZRhm3MKKBTmB3FIU6/9MW/2N2LcoTurwcC8+wELvGTfYXOIxUVjvpIZ21ZfR7F1n61XRAQ== AndreasFackler@gmx.de
11 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDYrVCkGerTtN6QDfZK30PMORBO7Up6Cbg3fikqIaGlLFN+osMn6NjZvfKBXb2JOnlPGRtuzb8KUYl14gtHo/eQ9BT5ASKbKp+LUw6eEmfcaZdd7H3x9GfsbH3+EG9ALm/NPqUBDXNshRq563yfPJMkz4Rk/hcTVURl0E3IPcLHE5ymjCz8Ar8NMdvmWAD7ft/QqoRRG4Bnx3Tc6uSi5s35jHdj66zQlLpoDpZ+IW3z7mk03nE7B8in1quHfNKwRYNIb0vBoV5nKSFwquGpYfB+M0/g1R9a8JRrLeMGv+XkGVGt6Ltja76fxYygZZDP99XrFqw89bEL4mOzrDCGTwDZ adam@higgsboson
12 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCnf8sXey8Q5tBPKEAkJAErTIYgluP/NnMpqG86dcSWDbJOXay5PKQh5iXwRLCJ+ZJLvft2a/QGMVXain/yF9wKugUPosFg7dqgQKyFQk0Y3nKK/I4OGyKd3XJtOBVckYow/wEPDLkAWThf2VimDudUbsJ6VPDbAlWBg8NTiDJRaPzohpkru7c/y+yyuFVxmRi4m+1YzM00R12HJr5jqf/qNOZI/pUccNEhMnchFlU7t++Pk0ZhwOgvLEeGfLGfI622HdNVToVNJ7VVxVMr+qyvqBXiIVfIdRVGvoBeoIboTpUxEcYvkgPouxQxkJOSrbxOF/b+3nQ6bff9UTUDL9zf bart@home
13 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCm34L9WS2OpYeJ8caQ3Z6dNrjIB1AZOCzejPJW33CeYMSN153l0p9pAlLNJPgOET/JcSHv07gOsdTzAqWZvEcuaLhCeX3X/WBXg1ZaqPvxXLsJLW4EtXDuENaQQ18oRpKFBuHULjkI4wopw34JMHWh6WIQrPVOLDcXsX9cfknviCGdlBScHahxB5ZZ9w5wKxdRDFqJEkit8rQlJR5grVrUq9SYb9zWUUBE0/YeULD6wIhrm5bDepfuTuELdhXF1nzUNQb6Kis5lsi9N1jeG5jMDWsP0cLYvUg1zkB4COiiI95ZT7Rwggbvj2/qrHG3P4LhJlXjaZTzyxjxZojMG+Tfjd0su3J+cnMGhkwj++f4CeFVo7Vbox6U5WT8E+UCXVqRcgvCOePdO76EI17bkHshhDef2RDGvBCYrkSy4f6iqCoKXRnPav2buEI+/pQgacfdxz3CeBrhuL1mXETO4BWf/YvDZYiX6L2+NgVcAVJEDXFrDNsMR2zRkqAKL3ysBOhGKJY20MxL6DuWMZv3byT3f8W3wnLDOQgN+k4HNtg/q5hi4a5KwoLPMTat/dD9lAgRpUhcdxh1AhkTmWxc12CrCpbVCc8kyzu4gdZLPE7ZGKP5YtbbHMJw5p2TzLMs9w8ZkB/WycTqZsqyYcHDPM5UDIh18/ncbITEhDyZIX6iHQ== qi.ma@maidsafe.net
14 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDX2xmkt7sgPmh13dG51YC2QMrznFWEHeMCqzUfP96mSh1dRPZB6nOFhEvMvHmJhqy6oyxWYChttPtSzWZf3o68+ur/YkRbFONV3Kn8sP9qfQHDGa7scT9n5EDxTLzGm1yN4RlQDD2bdhVkYmdkfLcdsEYntOi4Zj45N+xMziH1NQou02iwHuJTIHOscCxWyuTbKFYydNw1NWbCOX8AA0lZoqtrYTsZMceQ/AkLkG1N/dCZtQxMbfSBuRM9cbLsDK58n9PI+1c6OflIba2pb8lHiq7ThrZY8CcZolvFYRWlVYMfPysjKiiCQzegNQkGvKrb7r89swr6QAd/wGldqGab qi.ma@maidsafe.net
15 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDXPPVjQWY1A66cXlZIi7xrmwMa8TPIeYKMX9xWL5rW4IN1FJ0c6smL7qDRKs6eZP5XWYWV+aaeMoV5vBtCEEnA63xnEgQYgiY8UjLxWTY/0owpJWYg6WJNp26b8eKb/5Kwbpy88ETi52mSDTjJY+djfz30SPBOmHRV34Cmpi1paxWxSEzsxblCEU1Hv9WnE/fjt0E1VCKMKS6YGBEFuTRAnfOKIu7wlrbHkB5NaqGTqaj6ChO73TQe77qFnxQOp9Ph2jERaWFwvIZdFH0cD7+WpgmOaSjdzEYUESicqanZSgY2nN23zgMt16rigkuSoUWKhQavHpUFar17tAuQ7HQr dhardy@TPH-L13071
16 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDN0PnwrwauBsFhuDz1I86Dq/yyteYU+qDYdnYcwyknbx8RrDJ9zzf2rPvFgyQwPPE/HZxXO2jp2nRrUnobucC8nFPFU+owf0mgKkWyT+UD1iVvqT3QHvpKgVzcsM4mSKYoQSf0OymPUNbYRRy01BHdNLXrqHFnC6YshPejuLpijiFsKe0OSQIkjcUffx+Xe/iTFmXHSaZTb23wjTwInBNA7ZofTZCJ94uQRxGXqW0hGqeCr6lw5rL18iomX8IhCFSPZnBzVBET9ll4QLVpadeq35noXy+ArgmCoyS60cPnbX/ZpMDleNgV8ClSzjoE0+N7FPb/7OL3L7ZRCgTqO9Pt dhardy@yoga.dhardy
17 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDDbHl0Nu0wXbj0WFVACmG12uQcNduZsgMpLxL0ANoKMvue4VWhhW/nhIK1AIyW+iSvgf1DVQGduWkoeh7SGWN/eHzAqJ2/o4UFbmsl8mL0bcvSakz9xrwhhZQpaK/Vy2N8319cF3uUwujg3SA9S4Q7tu0UKVYA9YF2AN070z5jnJyqK2VVROoWHM48cm/zwHZJBWsqRya7GxpvG70NsyzR+Ap8oe7NKXynZr8bxnQ3JPJr7PsWnnQiiTlzWhjSInoLU1+5xxvnZe0xPhB8K1BBzoOvJDqeI9IrDVGFcxu5PduIyEP9G43swjU/dMuY7Y87WKzHUCU5EMYx4/R5R/I1 dhardy@localhost.localdomain
18 -ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCoMq7SaKOpcan3UlMqP9CYSrF1IkNekjQ3BKVV7fF0LTN5lxQ9rOi3knEEuFCvH2VMbYEKFGAieJa3OuZHlBQMfv66LChZCeAiBaG68iRww84DXBiGWDTuUOBmwepOhXfKIO4I2Qr/st3sPWbD4ddprHd7TJdFqpnTmGwG61m4wf0m3jWZygfqxA85UlweUjWsP6DerMVrfG7F+kNYGdpFcDR0CjPKC2cHwGyIhmBI9jhLHfR1k03+qLKLAcPIIjh8+iAep4FELpnPkrC222DmAL7X9KDuYeh+V2GWc/jcaERFzk3xUx59L4Q6YGnLcO2EoRlGiBOITdrut9DBCIjCcyd/MCkHovL+zdmWCqxYT4ITFsOW91a5UlAAStQLRtCkHbprmIaNEsu6mWAW6owTAIAj0u5f5wyBOEkb7BSifPpbg0jN1EqbKnx+YuXN5MvrKmRQzARpJCIGyhJBpvP7Uh+IJHtULoJNbd5XzWN0F6Z+szlIsPUt31NbPLIeLzqqHuW+rmf1Cl/wcEX8BzOnP3PtTH6TfxfwcwP3v4n2HchPdzY9ZJRd+E5zuEAW4hJL3iWtTM5ARWZC2RSk1wCXggbUkhUQxpPS4GpTzmaBiHNirNZUJU0SDnHcsYuEsQditSqrh01ss9Y8HQRYJ0n2Qh/soV4sUCoe5dyGp3SfHw== michael@michael-macbook.local
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/Whitepapers/technical_papers/img/safecoin farming speed.png version [590e37927e].
cannot compute difference between binary files
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/Whitepapers/technical_papers/img/safecoin resources.png version [ab4b3bd4f0].
cannot compute difference between binary files
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/Whitepapers/technical_papers/img/safecoin transfer mech.png version [0940833af9].
cannot compute difference between binary files
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/Whitepapers/technical_papers/safecoin citations.bib version [c35882d830].
1 -%% This BibTeX bibliography file was created using BibDesk.
2 -%% http://bibdesk.sourceforge.net/
3 -
4 -
5 -%% Created for Nick Lambert at 2015-01-07 10:11:39 +0000
6 -
7 -
8 -%% Saved with string encoding Unicode (UTF-8)
9 -
10 -
11 -
12 -@jurthesis{19,
13 - Date-Added = {2015-01-07 10:07:00 +0000},
14 - Date-Modified = {2015-01-07 10:11:29 +0000},
15 - Lastchecked = {7},
16 - Month = {January},
17 - Title = {Kademlia wikipedia page},
18 - Url = {http://en.wikipedia.org/wiki/Kademlia},
19 - Year = {2015}}
20 -
21 -@webpage{18,
22 - Author = {John Aziz},
23 - Date-Added = {2014-12-11 16:38:39 +0000},
24 - Date-Modified = {2014-12-11 16:40:04 +0000},
25 - Lastchecked = {11},
26 - Month = {December},
27 - Title = {Does the Federal Reserve really control the money supply?},
28 - Url = {http://theweek.com/article/index/244899/does-the-federal-reserve-really-control-the-money-supply},
29 - Year = {2014},
30 - Bdsk-Url-1 = {http://theweek.com/article/index/244899/does-the-federal-reserve-really-control-the-money-supply}}
31 -
32 -@webpage{17,
33 - Author = {Paul Krugman},
34 - Date-Added = {2014-12-11 15:08:47 +0000},
35 - Date-Modified = {2014-12-11 15:10:58 +0000},
36 - Lastchecked = {11},
37 - Month = {December},
38 - Title = {The textbook economics of cap-and-trade},
39 - Url = {http://krugman.blogs.nytimes.com/2009/09/27/the-textbook-economics-of-cap-and-trade/?_r=0},
40 - Year = {2014},
41 - Bdsk-Url-1 = {http://krugman.blogs.nytimes.com/2009/09/27/the-textbook-economics-of-cap-and-trade/?_r=0}}
42 -
43 -@webpage{16,
44 - Author = {The Atlantic},
45 - Date-Added = {2014-11-28 11:03:07 +0000},
46 - Date-Modified = {2014-11-28 11:03:45 +0000},
47 - Lastchecked = {28},
48 - Month = {November},
49 - Title = {The Internet's Original Sin},
50 - Url = {http://www.theatlantic.com/technology/archive/2014/08/advertising-is-the-internets-original-sin/376041/},
51 - Year = {2014},
52 - Bdsk-Url-1 = {http://www.theatlantic.com/technology/archive/2014/08/advertising-is-the-internets-original-sin/376041/}}
53 -
54 -@webpage{15,
55 - Author = {Facebook Inc},
56 - Date-Added = {2014-11-28 11:00:05 +0000},
57 - Date-Modified = {2014-11-28 11:00:53 +0000},
58 - Lastchecked = {28},
59 - Month = {November},
60 - Title = {Facebook Reports Fourth Quarter and Full Year 2013 Results},
61 - Url = {http://investor.fb.com/releasedetail.cfm?ReleaseID=821954},
62 - Year = {2014},
63 - Bdsk-Url-1 = {http://investor.fb.com/releasedetail.cfm?ReleaseID=821954}}
64 -
65 -@jurthesis{14,
66 - Author = {Google Inc},
67 - Date-Added = {2014-11-28 10:58:41 +0000},
68 - Date-Modified = {2014-12-11 16:48:12 +0000},
69 - Lastchecked = {28},
70 - Month = {November},
71 - Title = {2013 Financial Tables},
72 - Url = {https://investor.google.com/financial/2013/tables.html},
73 - Year = {2014},
74 - Bdsk-Url-1 = {https://investor.google.com/financial/2013/tables.html}}
75 -
76 -@webpage{13,
77 - Author = {Joe McCann},
78 - Date-Added = {2014-11-28 10:55:50 +0000},
79 - Date-Modified = {2014-11-28 11:01:03 +0000},
80 - Lastchecked = {28},
81 - Month = {November},
82 - Title = {Data Is The Most Valuable Commodity On Earth},
83 - Url = {http://subprint.com/blog/data-is-the-most-valuable-commodity-on-earth},
84 - Year = {2014},
85 - Bdsk-Url-1 = {http://subprint.com/blog/data-is-the-most-valuable-commodity-on-earth}}
86 -
87 -@webpage{12,
88 - Author = {World Economic Forum},
89 - Date-Added = {2014-11-28 10:51:45 +0000},
90 - Date-Modified = {2014-11-28 10:52:51 +0000},
91 - Lastchecked = {28},
92 - Month = {November},
93 - Title = {Personal Data: The Emergence of a New Asset Class},
94 - Url = {http://www3.weforum.org/docs/WEF_ITTC_PersonalDataNewAsset_Report_2011.pdf},
95 - Year = {2014},
96 - Bdsk-Url-1 = {http://www3.weforum.org/docs/WEF_ITTC_PersonalDataNewAsset_Report_2011.pdf}}
97 -
98 -@webpage{11,
99 - Author = {BBC News Web Page},
100 - Date-Added = {2014-11-28 10:36:05 +0000},
101 - Date-Modified = {2014-11-28 10:36:58 +0000},
102 - Lastchecked = {28},
103 - Month = {November},
104 - Title = {Gold v paper money},
105 - Url = {http://www.bbc.co.uk/news/business-18644230},
106 - Year = {2014},
107 - Bdsk-Url-1 = {http://www.bbc.co.uk/news/business-18644230}}
108 -
109 -@webpage{10,
110 - Date-Added = {2014-11-28 10:34:17 +0000},
111 - Date-Modified = {2014-11-28 10:35:07 +0000},
112 - Lastchecked = {28},
113 - Month = {November},
114 - Title = {ECR Research Web Page},
115 - Url = {http://www.ecrresearch.com/world-economy/dangers-and-drawbacks-quantitative-easing},
116 - Year = {2014},
117 - Bdsk-Url-1 = {http://www.ecrresearch.com/world-economy/dangers-and-drawbacks-quantitative-easing}}
118 -
119 -@webpage{9,
120 - Date-Added = {2014-11-28 10:31:55 +0000},
121 - Date-Modified = {2014-11-28 10:32:47 +0000},
122 - Lastchecked = {28},
123 - Month = {November},
124 - Title = {Federal Reserve Web Site},
125 - Url = {http://www.federalreserve.gov/faqs/currency_12773.htm},
126 - Year = {2014},
127 - Bdsk-Url-1 = {http://www.federalreserve.gov/faqs/currency_12773.htm}}
128 -
129 -@webpage{8,
130 - Date-Added = {2014-11-28 10:29:03 +0000},
131 - Date-Modified = {2014-11-28 11:01:10 +0000},
132 - Lastchecked = {28},
133 - Month = {November},
134 - Title = {Bountify Web Page},
135 - Url = {https://bountify.co/},
136 - Year = {2014},
137 - Bdsk-Url-1 = {https://bountify.co/}}
138 -
139 -@webpage{7,
140 - Date-Added = {2014-11-28 10:27:49 +0000},
141 - Date-Modified = {2014-11-28 10:28:30 +0000},
142 - Lastchecked = {28},
143 - Month = {November},
144 - Title = {Bounty Source Web Page},
145 - Url = {https://www.bountysource.com/},
146 - Year = {2014},
147 - Bdsk-Url-1 = {https://www.bountysource.com/}}
148 -
149 -@webpage{6,
150 - Date-Added = {2014-11-28 10:25:36 +0000},
151 - Date-Modified = {2014-11-28 11:01:22 +0000},
152 - Lastchecked = {28},
153 - Month = {November},
154 - Title = {MaidSafe Wikipedia},
155 - Url = {http://en.wikipedia.org/wiki/MaidSafe},
156 - Year = {2014},
157 - Bdsk-Url-1 = {http://en.wikipedia.org/wiki/MaidSafe}}
158 -
159 -@webpage{5,
160 - Date-Added = {2014-11-28 10:23:00 +0000},
161 - Date-Modified = {2014-11-28 10:24:14 +0000},
162 - Lastchecked = {28},
163 - Month = {November},
164 - Title = {Tor Incentives Roundup},
165 - Url = {https://blog.torproject.org/blog/tor-incentives-research-roundup-goldstar-par-braids-lira-tears-and-torcoin},
166 - Year = {2014},
167 - Bdsk-Url-1 = {https://blog.torproject.org/blog/tor-incentives-research-roundup-goldstar-par-braids-lira-tears-and-torcoin}}
168 -
169 -@webpage{4,
170 - Date-Added = {2014-11-27 16:52:58 +0000},
171 - Date-Modified = {2014-11-28 11:01:57 +0000},
172 - Lastchecked = {27},
173 - Month = {November},
174 - Title = {Tor Metrics --- Direct users by country},
175 - Url = {https://metrics.torproject.org/userstats-relay-country.html},
176 - Year = {2014},
177 - Bdsk-Url-1 = {https://metrics.torproject.org/userstats-relay-country.html}}
178 -
179 -@webpage{3,
180 - Date-Added = {2014-11-27 16:49:37 +0000},
181 - Date-Modified = {2014-11-27 16:51:52 +0000},
182 - Lastchecked = {27},
183 - Month = {November},
184 - Title = {Tor Metrics --- Relays and bridges in the network},
185 - Url = {https://metrics.torproject.org/networksize.html},
186 - Year = {2014},
187 - Bdsk-Url-1 = {https://metrics.torproject.org/networksize.html}}
188 -
189 -@url{2,
190 - Author = {Christopher Doll, T. F. McLaughlin, Anjali Barretto},
191 - Date-Added = {2014-11-27 16:29:54 +0000},
192 - Date-Modified = {2015-01-06 10:07:32 +0000},
193 - Journal = {The International Journal of Basic and Applied Science},
194 - Month = {July},
195 - Number = {01},
196 - Pages = {131-149},
197 - Title = {The Token Economy: A Recent Review and Evaluation},
198 - Url = {http://www.insikapub.com/Vol-02/No-01/12IJBAS(2)(1).pdf},
199 - Volume = {02},
200 - Year = {2013},
201 - Bdsk-Url-1 = {http://www.insikapub.com/Vol-02/No-01/12IJBAS(2)(1).pdf}}
202 -
203 -@webpage{1,
204 - Date-Modified = {2014-11-27 16:36:09 +0000},
205 - Owner = {nicklambert},
206 - Timestamp = {2014.11.27},
207 - Title = {Crypto-Currency Market Capitalizations},
208 - Url = {https://coinmarketcap.com/all/},
209 - Bdsk-Url-1 = {https://coinmarketcap.com/all/}}
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/rfcs/text/0009-mpid-messaging/MPID Message Flow.png version [e7d83bbd48].
cannot compute difference between binary files
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/rfcs/text/0011-improved-connection-management/Connection Management for Bootstrapping.png version [48210e296d].
cannot compute difference between binary files
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/rfcs/text/0011-improved-connection-management/Connection Management.png version [c128e0074a].
cannot compute difference between binary files
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/safe_examples/demo_app/resources/osx/helper_apps/Info EH.plist version [bb000d13a1].
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>CFBundleDisplayName</key>
  <string>{{productName}} Helper EH</string>
  <key>CFBundleExecutable</key>
  <string>{{productName}} Helper EH</string>
  <key>CFBundleIdentifier</key>
  <string>{{identifier}}.helper.EH</string>
  <key>CFBundleName</key>
  <string>{{productName}} Helper EH</string>
  <key>CFBundlePackageType</key>
  <string>APPL</string>
  <key>DTSDKName</key>
  <string>macosx</string>
  <key>LSUIElement</key>
  <true/>
  <key>NSSupportsAutomaticGraphicsSwitching</key>
  <true/>
</dict>
</plist>
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/safe_examples/demo_app/resources/osx/helper_apps/Info NP.plist version [0a518159ab].
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>CFBundleDisplayName</key>
  <string>{{productName}} Helper NP</string>
  <key>CFBundleExecutable</key>
  <string>{{productName}} Helper NP</string>
  <key>CFBundleIdentifier</key>
  <string>{{identifier}}.helper.NP</string>
  <key>CFBundleName</key>
  <string>{{productName}} Helper NP</string>
  <key>CFBundlePackageType</key>
  <string>APPL</string>
  <key>DTSDKName</key>
  <string>macosx</string>
  <key>LSUIElement</key>
  <true/>
  <key>NSSupportsAutomaticGraphicsSwitching</key>
  <true/>
</dict>
</plist>
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/safe_launcher/resources/osx/helper_apps/Info EH.plist version [bb000d13a1].
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>CFBundleDisplayName</key>
  <string>{{productName}} Helper EH</string>
  <key>CFBundleExecutable</key>
  <string>{{productName}} Helper EH</string>
  <key>CFBundleIdentifier</key>
  <string>{{identifier}}.helper.EH</string>
  <key>CFBundleName</key>
  <string>{{productName}} Helper EH</string>
  <key>CFBundlePackageType</key>
  <string>APPL</string>
  <key>DTSDKName</key>
  <string>macosx</string>
  <key>LSUIElement</key>
  <true/>
  <key>NSSupportsAutomaticGraphicsSwitching</key>
  <true/>
</dict>
</plist>
Deleted wiki_references/2017/software/MaidSafe_net/src_from_GitHub/the_repository_clones/safe_launcher/resources/osx/helper_apps/Info NP.plist version [0a518159ab].
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>CFBundleDisplayName</key>
  <string>{{productName}} Helper NP</string>
  <key>CFBundleExecutable</key>
  <string>{{productName}} Helper NP</string>
  <key>CFBundleIdentifier</key>
  <string>{{identifier}}.helper.NP</string>
  <key>CFBundleName</key>
  <string>{{productName}} Helper NP</string>
  <key>CFBundlePackageType</key>
  <string>APPL</string>
  <key>DTSDKName</key>
  <string>macosx</string>
  <key>LSUIElement</key>
  <true/>
  <key>NSSupportsAutomaticGraphicsSwitching</key>
  <true/>
</dict>
</plist>
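The `{{productName}}` and `{{identifier}}` tokens in the plists above are template placeholders, presumably expanded by the app packaging step before the helper bundles are written out. A minimal sketch of that substitution, assuming simple string replacement; `render_plist` and the sample values are hypothetical, not part of the actual demo_app build:

```rust
// Hypothetical sketch: expand {{productName}} / {{identifier}} placeholders
// in a plist template via plain string replacement.
fn render_plist(template: &str, product_name: &str, identifier: &str) -> String {
    template
        .replace("{{productName}}", product_name)
        .replace("{{identifier}}", identifier)
}

fn main() {
    // Sample values are illustrative only.
    let tpl = "<key>CFBundleIdentifier</key>\n<string>{{identifier}}.helper.EH</string>";
    println!("{}", render_plist(tpl, "SAFE Demo", "net.maidsafe.demo"));
}
```

With this, the four near-identical templates differ only in the `EH`/`NP` suffix baked into each file.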
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/Cargo.toml version [6662c28d93].
[package]
authors = ["The Rust Project Developers"]
name = "bootstrap"
version = "0.0.0"

[lib]
name = "bootstrap"
path = "lib.rs"
doctest = false

[[bin]]
name = "bootstrap"
path = "bin/main.rs"
test = false

[[bin]]
name = "rustc"
path = "bin/rustc.rs"
test = false

[[bin]]
name = "rustdoc"
path = "bin/rustdoc.rs"
test = false

[[bin]]
name = "sccache-plus-cl"
path = "bin/sccache-plus-cl.rs"
test = false

[dependencies]
build_helper = { path = "../build_helper" }
cmake = "0.1.23"
filetime = "0.1"
num_cpus = "1.0"
toml = "0.1"
getopts = "0.2"
rustc-serialize = "0.3"
gcc = "0.3.46"
libc = "0.2"
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/README.md version [2bda9352cd].
# rustbuild - Bootstrapping Rust

This is an in-progress README which is targeted at helping to explain how Rust
is bootstrapped and in general some of the technical details of the build
system.

## Using rustbuild

The rustbuild build system has a primary entry point, a top level `x.py` script:

```
python ./x.py build
```

Note that if you're on Unix you should be able to execute the script directly:

```
./x.py build
```

The script accepts commands, flags, and arguments to determine what to do:

* `build` - a general purpose command for compiling code. Alone, `build` will
  bootstrap the entire compiler; otherwise the arguments passed indicate what to
  build. For example:

  ```
  # build the whole compiler
  ./x.py build

  # build the stage1 compiler
  ./x.py build --stage 1

  # build stage0 libstd
  ./x.py build --stage 0 src/libstd

  # build a particular crate in stage0
  ./x.py build --stage 0 src/libtest
  ```

  If files are dirty that would normally be rebuilt from stage 0, that can be
  overridden using `--keep-stage 0`. Using `--keep-stage n` will skip all steps
  that belong to stage n or earlier:

  ```
  # keep old build products for stage 0 and build stage 1
  ./x.py build --keep-stage 0 --stage 1
  ```

* `test` - a command for executing unit tests. Like the `build` command, this
  will execute the entire test suite by default; otherwise it can be used to
  select which test suite is run:

  ```
  # run all unit tests
  ./x.py test

  # execute the run-pass test suite
  ./x.py test src/test/run-pass

  # execute only some tests in the run-pass test suite
  ./x.py test src/test/run-pass --test-args substring-of-test-name

  # execute tests in the standard library in stage0
  ./x.py test --stage 0 src/libstd

  # execute all doc tests
  ./x.py test src/doc
  ```

* `doc` - a command for building documentation. Like the commands above, it can
  take arguments for what to document.

## Configuring rustbuild

There are currently two primary methods for configuring the rustbuild build
system. First, the `./configure` options serialized in `config.mk` will be
parsed and read. That is, if any `./configure` options are passed, they'll be
handled naturally.

Next, rustbuild offers a TOML-based configuration system with a `config.toml`
file in the same location as `config.mk`. An example of this configuration can
be found at `src/bootstrap/config.toml.example`, and the configuration file
can also be passed as `--config path/to/config.toml` if the build system is
being invoked manually (via the python script).

Finally, rustbuild makes use of the [gcc-rs crate] which has [its own
method][env-vars] of configuring C compilers and C flags via environment
variables.

[gcc-rs crate]: https://github.com/alexcrichton/gcc-rs
[env-vars]: https://github.com/alexcrichton/gcc-rs#external-configuration-via-environment-variables

## Build stages

The rustbuild build system goes through a few phases to actually build the
compiler. What actually happens when you invoke rustbuild is:

1. The entry point script, `x.py`, is run. This script is responsible for
   downloading the stage0 compiler/Cargo binaries, and it then compiles the
   build system itself (this folder). Finally, it then invokes the actual
   `bootstrap` binary build system.
2. In Rust, `bootstrap` will slurp up all configuration, perform a number of
   sanity checks (compilers exist, for example), and then start building the
   stage0 artifacts.
3. The stage0 `cargo` downloaded earlier is used to build the standard library
   and the compiler, and then these binaries are copied to the `stage1`
   directory. That compiler is then used to generate the stage1 artifacts which
   are then copied to the stage2 directory, and then finally the stage2
   artifacts are generated using that compiler.

The goal of each stage is to (a) leverage Cargo as much as possible and, failing
that, (b) leverage Rust as much as possible!

## Incremental builds

You can configure rustbuild to use incremental compilation. Because incremental
compilation is new and evolving rapidly, if you want to use it, it is
recommended that you replace the snapshot with a locally installed nightly
build of rustc. You will want to keep this up to date.

To follow this course of action, the first thing you will want to do is
install a nightly, presumably using `rustup`. You will then want to configure
your directory to use this build, like so:

```
# configure to use local rust instead of downloading a beta.
# `--local-rust-root` is optional here. If elided, we will
# use whatever rustc we find on your PATH.
> configure --enable-rustbuild --local-rust-root=~/.cargo/ --enable-local-rebuild
```

After that, you can use the `--incremental` flag to actually do incremental
builds:

```
> ../x.py build --incremental
```

The `--incremental` flag will store incremental compilation artifacts
in `build/<host>/stage0-incremental`. Note that we only use incremental
compilation for the stage0 -> stage1 compilation -- this is because
the stage1 compiler is changing, and we don't try to cache and reuse
incremental artifacts across different versions of the compiler. For
this reason, `--incremental` defaults to `--stage 1` (though you can
manually select a higher stage, if you prefer).

You can always drop the `--incremental` to build as normal (but you
will still be using the local nightly as your bootstrap).

## Directory Layout

This build system houses all output under the `build` directory, which looks
like this:

```
# Root folder of all output. Everything is scoped underneath here
build/

  # Location where the stage0 compiler downloads are all cached. This directory
  # only contains the tarballs themselves as they're extracted elsewhere.
  cache/
    2015-12-19/
    2016-01-15/
    2016-01-21/
    ...

  # Output directory for building this build system itself. The stage0
  # cargo/rustc are used to build the build system into this location.
  bootstrap/
    debug/
    release/

  # Output of the dist-related steps like dist-std, dist-rustc, and dist-docs
  dist/

  # Temporary directory used for various input/output as part of various stages
  tmp/

  # Each remaining directory is scoped by the "host" triple of compilation at
  # hand.
  x86_64-unknown-linux-gnu/

    # The build artifacts for the `compiler-rt` library for the target this
    # folder is under. The exact layout here will likely depend on the platform,
    # and this is also built with CMake so the build system is also likely
    # different.
    compiler-rt/
      build/

    # Output folder for LLVM if it is compiled for this target
    llvm/

      # build folder (e.g. the platform-specific build system). Like with
      # compiler-rt this is compiled with CMake
      build/

      # Installation of LLVM. Note that we run the equivalent of 'make install'
      # for LLVM to setup these folders.
      bin/
      lib/
      include/
      share/
      ...

    # Output folder for all documentation of this target. This is what's filled
    # in whenever the `doc` step is run.
    doc/

    # Output for all compiletest-based test suites
    test/
      run-pass/
      compile-fail/
      debuginfo/
      ...

    # Location where the stage0 Cargo and Rust compiler are unpacked. This
    # directory is purely an extracted and overlaid tarball of these two (done
    # by the bootstrap python script). In theory the build system does not
    # modify anything under this directory afterwards.
    stage0/

    # These two build directories are the cargo output directories for builds of
    # the standard library and compiler, respectively. Internally these may also
    # have other target directories, which represent artifacts being compiled
    # from the host to the specified target.
    #
    # Essentially, each of these directories is filled in by one `cargo`
    # invocation. The build system orchestrates calling Cargo in the right order
    # with the right variables to ensure these are filled in correctly.
    stageN-std/
    stageN-test/
    stageN-rustc/
    stageN-tools/

    # This is a special case of the above directories, **not** filled in via
    # Cargo but rather the build system itself. The stage0 compiler already has
    # a set of target libraries for its own host triple (in its own sysroot)
    # inside of stage0/. When we run the stage0 compiler to bootstrap more
    # things, however, we don't want to use any of these libraries (as those are
    # the ones that we're building). So essentially, when the stage1 compiler is
    # being compiled (e.g. after libstd has been built), *this* is used as the
    # sysroot for the stage0 compiler being run.
    #
    # Basically this directory is just a temporary artifact used to configure the
    # stage0 compiler to ensure that the libstd we just built is used to
    # compile the stage1 compiler.
    stage0-sysroot/lib/

    # These output directories are intended to be standalone working
    # implementations of the compiler (corresponding to each stage). The build
    # system will link (using hard links) output from stageN-{std,rustc} into
    # each of these directories.
    #
    # In theory there is no extra build output in these directories.
    stage1/
    stage2/
    stage3/
```

## Cargo projects

The current build is unfortunately not quite as simple as `cargo build` in a
directory, but rather the compiler is split into three different Cargo projects:

* `src/libstd` - the standard library
* `src/libtest` - testing support, depends on libstd
* `src/rustc` - the actual compiler itself

Each "project" has a corresponding Cargo.lock file with all dependencies, and
this means that building the compiler involves running Cargo three times. The
structure here serves two goals:

1. Facilitating dependencies coming from crates.io. These dependencies don't
   depend on `std`, so libstd is a separate project compiled ahead of time
   before the actual compiler builds.
2. Splitting "host artifacts" from "target artifacts". That is, when building
   code for an arbitrary target you don't need the entire compiler, but you'll
   end up needing libraries like libtest that depend on std but also want to use
   crates.io dependencies. Hence, libtest is split out as its own project that
   is sequenced after `std` but before `rustc`. This project is built for all
   targets.

There is some loss in build parallelism here because libtest can be compiled in
parallel with a number of rustc artifacts, but in theory the loss isn't too bad!

## Build tools

We've actually got quite a few tools that we use in the compiler's build system
and for testing. To organize these, each tool is a project in `src/tools` with a
corresponding `Cargo.toml`. All tools are compiled with Cargo (currently having
independent `Cargo.lock` files) and do not currently explicitly depend on the
compiler or standard library. Compiling each tool is sequenced after the
appropriate libstd/libtest/librustc compile above.

## Extending rustbuild

So you'd like to add a feature to the rustbuild build system or just fix a bug.
Great! One of the major motivational factors for moving away from `make` is that
Rust is in theory much easier to read, modify, and write. If you find anything
excessively confusing, please open an issue on this and we'll try to get it
documented or simplified pronto.

First up, you'll probably want to read over the documentation above as that'll
give you a high level overview of what rustbuild is doing. You also probably
want to play around a bit yourself by just getting it up and running before you
dive too much into the actual build system itself.

After that, each module in rustbuild should have enough documentation to keep
you up and running. Some general areas that you may be interested in modifying
are:

* Adding a new build tool? Take a look at `bootstrap/step.rs` for examples of
  other tools.
* Adding a new compiler crate? Look no further! Adding crates can be done by
  adding a new directory with `Cargo.toml` followed by configuring all
  `Cargo.toml` files accordingly.
* Adding a new dependency from crates.io? We're still working on that, so hold
  off on that for now.
* Adding a new configuration option? Take a look at `bootstrap/config.rs` or
  perhaps `bootstrap/flags.rs` and then modify the build elsewhere to read that
  option.
* Adding a sanity check? Take a look at `bootstrap/sanity.rs`.

If you have any questions feel free to reach out on `#rust-internals` on IRC or
open an issue in the bug tracker!
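The "Build stages" section of the README above describes a hand-off where each stage's artifacts become the compiler for the next stage. A schematic sketch of that hand-off, purely illustrative (the real sequencing lives in the `bootstrap` binary, not in code like this):

```rust
// Illustrative model of the staged bootstrap: the artifacts for stage N are
// produced by the compiler assembled from stage N-1's output, starting from
// the downloaded stage0 snapshot.
fn compiler_for_stage(stage: u32) -> String {
    if stage == 0 {
        "downloaded stage0 snapshot".to_string()
    } else {
        format!("stage{} rustc", stage - 1)
    }
}

fn main() {
    for stage in 0..3 {
        println!("stage{} artifacts <- built by {}", stage, compiler_for_stage(stage));
    }
}
```

The point of the schematic is only the direction of the arrows: nothing at stage N is ever built with a compiler newer than stage N-1's output.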
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/bin/main.rs version [46a25b876b].
// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! rustbuild, the Rust build system
//!
//! This is the entry point for the build system used to compile the `rustc`
//! compiler. Lots of documentation can be found in the `README.md` file in the
//! parent directory, and otherwise documentation can be found throughout the `build`
//! directory in each respective module.

#![deny(warnings)]

extern crate bootstrap;

use std::env;

use bootstrap::{Flags, Config, Build};

fn main() {
    let args = env::args().skip(1).collect::<Vec<_>>();
    let flags = Flags::parse(&args);
    let mut config = Config::parse(&flags.build, flags.config.clone());

    // compat with `./configure` while we're still using that
    if std::fs::metadata("config.mk").is_ok() {
        config.update_with_config_mk();
    }

    Build::new(flags, config).build();
}
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/bin/rustc.rs version [6d279a313b].
// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! Shim which is passed to Cargo as "rustc" when running the bootstrap.
//!
//! This shim will take care of various tasks that our build process
//! requires that Cargo can't quite do through normal configuration:
//!
//! 1. When compiling build scripts and build dependencies, we need a guaranteed
//!    full standard library available. The only compiler which actually has
//!    this is the snapshot, so we detect this situation and always compile with
//!    the snapshot compiler.
//! 2. We pass a bunch of `--cfg` and other flags based on what we're compiling
//!    (and this slightly differs based on whether we're using a snapshot or
//!    not), so we do that all here.
//!
//! This may one day be replaced by RUSTFLAGS, but the dynamic nature of
//! switching compilers for the bootstrap and for build scripts will probably
//! never get replaced.

#![deny(warnings)]

extern crate bootstrap;

use std::env;
use std::ffi::OsString;
use std::io;
use std::io::prelude::*;
use std::str::FromStr;
use std::path::PathBuf;
use std::process::{Command, ExitStatus};

fn main() {
    let args = env::args_os().skip(1).collect::<Vec<_>>();

    // Detect whether or not we're a build script depending on whether --target
    // is passed (a bit janky...)
    let target = args.windows(2)
        .find(|w| &*w[0] == "--target")
        .and_then(|w| w[1].to_str());
    let version = args.iter().find(|w| &**w == "-vV");

    let verbose = match env::var("RUSTC_VERBOSE") {
        Ok(s) => usize::from_str(&s).expect("RUSTC_VERBOSE should be an integer"),
        Err(_) => 0,
    };

    // Build scripts always use the snapshot compiler which is guaranteed to be
    // able to produce an executable, whereas intermediate compilers may not
    // have the standard library built yet and may not be able to produce an
    // executable. Otherwise we just use the standard compiler we're
    // bootstrapping with.
    //
    // Also note that cargo will detect the version of the compiler to trigger
    // a rebuild when the compiler changes. If this happens, we want to make
    // sure to use the actual compiler instead of the snapshot compiler because
    // that's the one that's actually changing.
    let (rustc, libdir) = if target.is_none() && version.is_none() {
        ("RUSTC_SNAPSHOT", "RUSTC_SNAPSHOT_LIBDIR")
    } else {
        ("RUSTC_REAL", "RUSTC_LIBDIR")
    };
    let stage = env::var("RUSTC_STAGE").expect("RUSTC_STAGE was not set");
    let sysroot = env::var_os("RUSTC_SYSROOT").expect("RUSTC_SYSROOT was not set");
    let mut on_fail = env::var_os("RUSTC_ON_FAIL").map(|of| Command::new(of));

    let rustc = env::var_os(rustc).unwrap_or_else(|| panic!("{:?} was not set", rustc));
    let libdir = env::var_os(libdir).unwrap_or_else(|| panic!("{:?} was not set", libdir));
    let mut dylib_path = bootstrap::util::dylib_path();
    dylib_path.insert(0, PathBuf::from(libdir));

    let mut cmd = Command::new(rustc);
    cmd.args(&args)
        .arg("--cfg")
        .arg(format!("stage{}", stage))
        .env(bootstrap::util::dylib_path_var(),
             env::join_paths(&dylib_path).unwrap());

    if let Some(target) = target {
        // The stage0 compiler has a special sysroot distinct from what we
        // actually downloaded, so we just always pass the `--sysroot` option.
        cmd.arg("--sysroot").arg(sysroot);

        // When we build Rust dylibs they're all intended for intermediate
        // usage, so make sure we pass the -Cprefer-dynamic flag instead of
        // linking all deps statically into the dylib.
        if env::var_os("RUSTC_NO_PREFER_DYNAMIC").is_none() {
            cmd.arg("-Cprefer-dynamic");
        }

        // Pass the `rustbuild` feature flag to crates which rustbuild is
        // building. See the comment in bootstrap/lib.rs where this env var is
        // set for more details.
        if env::var_os("RUSTBUILD_UNSTABLE").is_some() {
            cmd.arg("--cfg").arg("rustbuild");
        }

        // Help the libc crate compile by assisting it in finding the MUSL
        // native libraries.
        if let Some(s) = env::var_os("MUSL_ROOT") {
            let mut root = OsString::from("native=");
            root.push(&s);
            root.push("/lib");
            cmd.arg("-L").arg(&root);
        }

        // Pass down extra flags, commonly used to configure `-Clinker` when
        // cross compiling.
        if let Ok(s) = env::var("RUSTC_FLAGS") {
            cmd.args(&s.split(" ").filter(|s| !s.is_empty()).collect::<Vec<_>>());
        }

        // Pass down incremental directory, if any.
        if let Ok(dir) = env::var("RUSTC_INCREMENTAL") {
            cmd.arg(format!("-Zincremental={}", dir));

            if verbose > 0 {
                cmd.arg("-Zincremental-info");
            }
        }

        // If we're compiling specifically the `panic_abort` crate then we pass
        // the `-C panic=abort` option. Note that we do not do this for any
        // other crate intentionally as this is the only crate for now that we
        // ship with panic=abort.
        //
        // This... is a bit of a hack how we detect this. Ideally this
        // information should be encoded in the crate I guess? Would likely
        // require an RFC amendment to RFC 1513, however.
        let is_panic_abort = args.windows(2)
            .any(|a| &*a[0] == "--crate-name" && &*a[1] == "panic_abort");
        if is_panic_abort {
            cmd.arg("-C").arg("panic=abort");
        }

        // Set various options from config.toml to configure how we're building
        // code.
        if env::var("RUSTC_DEBUGINFO") == Ok("true".to_string()) {
            cmd.arg("-g");
        } else if env::var("RUSTC_DEBUGINFO_LINES") == Ok("true".to_string()) {
            cmd.arg("-Cdebuginfo=1");
        }
        let debug_assertions = match env::var("RUSTC_DEBUG_ASSERTIONS") {
            Ok(s) => if s == "true" { "y" } else { "n" },
            Err(..) => "n",
        };
        cmd.arg("-C").arg(format!("debug-assertions={}", debug_assertions));
        if let Ok(s) = env::var("RUSTC_CODEGEN_UNITS") {
            cmd.arg("-C").arg(format!("codegen-units={}", s));
        }

        // Emit save-analysis info.
        if env::var("RUSTC_SAVE_ANALYSIS") == Ok("api".to_string()) {
            cmd.arg("-Zsave-analysis-api");
        }

        // Dealing with rpath here is a little special, so let's go into some
        // detail. First off, `-rpath` is a linker option on Unix platforms
        // which adds to the runtime dynamic loader path when looking for
        // dynamic libraries. We use this by default on Unix platforms to ensure
        // that our nightlies behave the same on Windows, that is they work out
        // of the box. This can be disabled, of course, but basically that's why
        // we're gated on RUSTC_RPATH here.
        //
        // Ok, so the astute might be wondering "why isn't `-C rpath` used
        // here?" and that is indeed a good question to ask. This codegen
        // option is the compiler's current interface to generating an rpath.
        // Unfortunately it doesn't quite suffice for us. The flag currently
        // takes no value as an argument, so the compiler calculates what it
        // should pass to the linker as `-rpath`. This unfortunately is based on
        // the **compile time** directory structure which when building with
        // Cargo will be very different than the runtime directory structure.
        //
        // All that's a really long-winded way of saying that if we use
        // `-Crpath` then the executables generated have the wrong rpath of
        // something like `$ORIGIN/deps` when in fact the way we distribute
        // rustc requires the rpath to be `$ORIGIN/../lib`.
        //
        // So, all in all, to set up the correct rpath we pass the linker
        // argument manually via `-C link-args=-Wl,-rpath,...`. Plus isn't it
        // fun to pass a flag to a tool to pass a flag to pass a flag to a tool
        // to change a flag in a binary?
        if env::var("RUSTC_RPATH") == Ok("true".to_string()) {
            let rpath = if target.contains("apple") {
                // Note that we need to take one extra step on macOS to also pass
                // `-Wl,-install_name,@rpath/...` to get things to work right. To
                // do that we pass a weird flag to the compiler to get it to do
                // so. Note that this is definitely a hack, and we should likely
                // flesh out rpath support more fully in the future.
                //
                // FIXME: remove condition after next stage0
                if stage != "0" {
                    cmd.arg("-Z").arg("osx-rpath-install-name");
                }
                Some("-Wl,-rpath,@loader_path/../lib")
            } else if !target.contains("windows") {
                Some("-Wl,-rpath,$ORIGIN/../lib")
            } else {
                None
            };
            if let Some(rpath) = rpath {
                cmd.arg("-C").arg(format!("link-args={}", rpath));
            }

            if let Ok(s) = env::var("RUSTFLAGS") {
                for flag in s.split_whitespace() {
                    cmd.arg(flag);
                }
            }
        }

        if target.contains("pc-windows-msvc") {
            cmd.arg("-Z").arg("unstable-options");
            cmd.arg("-C").arg("target-feature=+crt-static");
        }

        // Force all crates compiled by this compiler to (a) be unstable and (b)
        // allow the `rustc_private` feature to link to other unstable crates
        // also in the sysroot.
        //
        // FIXME: remove condition after next stage0
        if env::var_os("RUSTC_FORCE_UNSTABLE").is_some() {
            if stage != "0" {
                cmd.arg("-Z").arg("force-unstable-if-unmarked");
            }
        }
    }

    if verbose > 1 {
        writeln!(&mut io::stderr(), "rustc command: {:?}", cmd).unwrap();
    }

    // Actually run the compiler!
    std::process::exit(if let Some(ref mut on_fail) = on_fail {
        match cmd.status() {
            Ok(s) if s.success() => 0,
            _ => {
                println!("\nDid not run successfully:\n{:?}\n-------------", cmd);
                exec_cmd(on_fail).expect("could not run the backup command");
                1
            }
        }
    } else {
        std::process::exit(match exec_cmd(&mut cmd) {
            Ok(s) => s.code().unwrap_or(0xfe),
            Err(e) => panic!("\n\nfailed to run {:?}: {}\n\n", cmd, e),
        })
    })
}

#[cfg(unix)]
fn exec_cmd(cmd: &mut Command) -> ::std::io::Result<ExitStatus> {
    use std::os::unix::process::CommandExt;
    Err(cmd.exec())
}

#[cfg(not(unix))]
fn exec_cmd(cmd: &mut Command) -> ::std::io::Result<ExitStatus> {
    cmd.status()
}
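The shim above only works because the build system launches Cargo with a specific set of environment variables: Cargo honors `RUSTC` as an override for which compiler binary to invoke, and the shim then reads `RUSTC_REAL`/`RUSTC_SNAPSHOT`, `RUSTC_STAGE`, and friends to decide what to exec. A minimal sketch of that wiring, with hypothetical placeholder paths (the real values are computed inside `bootstrap`):

```rust
// Sketch of the environment contract between the build system and the rustc
// shim: Cargo is pointed at the shim via RUSTC, and the shim reads the
// RUSTC_* variables to pick the real compiler. Paths are placeholders.
use std::process::Command;

fn cargo_with_shim(shim: &str, real_rustc: &str, stage: u32) -> Command {
    let mut cargo = Command::new("cargo");
    cargo.arg("build")
        .env("RUSTC", shim)             // Cargo invokes this instead of rustc
        .env("RUSTC_REAL", real_rustc)  // what the shim execs for normal crates
        .env("RUSTC_STAGE", stage.to_string());
    cargo
}

fn main() {
    let cmd = cargo_with_shim("build/bootstrap/debug/rustc",
                              "build/stage0/bin/rustc", 0);
    // The command is not spawned here; this only illustrates the wiring.
    assert_eq!(cmd.get_program().to_str(), Some("cargo"));
}
```

Because every per-crate decision (sysroot, `-Cprefer-dynamic`, rpath, panic strategy) is made in the shim rather than in Cargo configuration, the build system can change policy per invocation just by changing environment variables.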
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/bin/rustdoc.rs version [db1a550d09].
```rust
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! Shim which is passed to Cargo as "rustdoc" when running the bootstrap.
//!
//! See comments in `src/bootstrap/rustc.rs` for more information.

#![deny(warnings)]

extern crate bootstrap;

use std::env;
use std::process::Command;
use std::path::PathBuf;

fn main() {
    let args = env::args_os().skip(1).collect::<Vec<_>>();
    let rustdoc = env::var_os("RUSTDOC_REAL").expect("RUSTDOC_REAL was not set");
    let libdir = env::var_os("RUSTC_LIBDIR").expect("RUSTC_LIBDIR was not set");
    let stage = env::var("RUSTC_STAGE").expect("RUSTC_STAGE was not set");
    let sysroot = env::var_os("RUSTC_SYSROOT").expect("RUSTC_SYSROOT was not set");

    let mut dylib_path = bootstrap::util::dylib_path();
    dylib_path.insert(0, PathBuf::from(libdir));

    let mut cmd = Command::new(rustdoc);
    cmd.args(&args)
       .arg("--cfg")
       .arg(format!("stage{}", stage))
       .arg("--cfg")
       .arg("dox")
       .arg("--sysroot")
       .arg(sysroot)
       .env(bootstrap::util::dylib_path_var(),
            env::join_paths(&dylib_path).unwrap());

    // Pass the `rustbuild` feature flag to crates which rustbuild is
    // building. See the comment in bootstrap/lib.rs where this env var is
    // set for more details.
    if env::var_os("RUSTBUILD_UNSTABLE").is_some() {
        cmd.arg("--cfg").arg("rustbuild");
    }

    std::process::exit(match cmd.status() {
        Ok(s) => s.code().unwrap_or(1),
        Err(e) => panic!("\n\nfailed to run {:?}: {}\n\n", cmd, e),
    })
}
```
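The key move in the rustdoc shim above is inserting `RUSTC_LIBDIR` at the front of the platform's dynamic-library search path before spawning the real tool, so the stage0 libraries are found first. A minimal sketch of that prepend-to-path-variable pattern, with `prepend_dylib_path` as a hypothetical helper name:

```python
import os

def prepend_dylib_path(env, var, libdir):
    """Return a copy of `env` with `libdir` prepended to the
    os.pathsep-separated search path in env[var], mirroring how the shim
    puts the stage0 lib directory ahead of anything already installed."""
    new_env = dict(env)
    old = env.get(var, "")
    new_env[var] = libdir + (os.pathsep + old if old else "")
    return new_env

# Example: a pre-existing LD_LIBRARY_PATH is kept, but searched second.
env = prepend_dylib_path({"LD_LIBRARY_PATH": "/usr/lib"},
                         "LD_LIBRARY_PATH", "/build/stage0/lib")
```

The same pattern works for `DYLD_LIBRARY_PATH` on macOS and `PATH` on Windows, which is why the real shim asks `bootstrap::util::dylib_path_var()` for the variable name rather than hard-coding it.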
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/bin/sccache-plus-cl.rs version [d4d6af0476].
```rust
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

extern crate gcc;

use std::env;
use std::process::{self, Command};

fn main() {
    let target = env::var("SCCACHE_TARGET").unwrap();
    // Locate the actual compiler that we're invoking
    env::remove_var("CC");
    env::remove_var("CXX");
    let mut cfg = gcc::Config::new();
    cfg.cargo_metadata(false)
       .out_dir("/")
       .target(&target)
       .host(&target)
       .opt_level(0)
       .debug(false);
    let compiler = cfg.get_compiler();

    // Invoke sccache with said compiler
    let sccache_path = env::var_os("SCCACHE_PATH").unwrap();
    let mut cmd = Command::new(&sccache_path);
    cmd.arg(compiler.path());
    for &(ref k, ref v) in compiler.env() {
        cmd.env(k, v);
    }
    for arg in env::args().skip(1) {
        cmd.arg(arg);
    }

    let status = cmd.status().expect("failed to spawn");
    process::exit(status.code().unwrap_or(2))
}
```
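The wrapper above rewrites a compiler invocation `cc args...` into `sccache cc args...`, carrying the probed compiler's environment along. A minimal sketch of that argv rewrite, with `sccache_command` as a hypothetical helper name:

```python
def sccache_command(sccache_path, compiler_path, compiler_env, args):
    """Build the argv and environment for an sccache-wrapped compiler
    call: sccache runs first, the real compiler becomes its first
    argument, and the original arguments follow unchanged."""
    argv = [sccache_path, compiler_path] + list(args)
    return argv, dict(compiler_env)

argv, env = sccache_command("/usr/local/bin/sccache", "cl.exe",
                            {"INCLUDE": "C:/msvc/include"}, ["/c", "foo.c"])
```

Because the wrapper forwards everything verbatim, the cache sees the exact command the build system would have run, which is what makes its hashing of inputs sound.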
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/bootstrap.py version [46769d89ad].
```python
# Copyright 2015-2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.

from __future__ import print_function
import argparse
import contextlib
import datetime
import hashlib
import os
import shutil
import subprocess
import sys
import tarfile
import tempfile

from time import time


def get(url, path, verbose=False):
    sha_url = url + ".sha256"
    with tempfile.NamedTemporaryFile(delete=False) as temp_file:
        temp_path = temp_file.name
    with tempfile.NamedTemporaryFile(suffix=".sha256", delete=False) as sha_file:
        sha_path = sha_file.name

    try:
        download(sha_path, sha_url, False, verbose)
        if os.path.exists(path):
            if verify(path, sha_path, False):
                if verbose:
                    print("using already-download file " + path)
                return
            else:
                if verbose:
                    print("ignoring already-download file " + path + " due to failed verification")
                os.unlink(path)
        download(temp_path, url, True, verbose)
        if not verify(temp_path, sha_path, verbose):
            raise RuntimeError("failed verification")
        if verbose:
            print("moving {} to {}".format(temp_path, path))
        shutil.move(temp_path, path)
    finally:
        delete_if_present(sha_path, verbose)
        delete_if_present(temp_path, verbose)


def delete_if_present(path, verbose):
    if os.path.isfile(path):
        if verbose:
            print("removing " + path)
        os.unlink(path)


def download(path, url, probably_big, verbose):
    for x in range(0, 4):
        try:
            _download(path, url, probably_big, verbose, True)
            return
        except RuntimeError:
            print("\nspurious failure, trying again")
    _download(path, url, probably_big, verbose, False)


def _download(path, url, probably_big, verbose, exception):
    if probably_big or verbose:
        print("downloading {}".format(url))
    # see http://serverfault.com/questions/301128/how-to-download
    if sys.platform == 'win32':
        run(["PowerShell.exe", "/nologo", "-Command",
             "(New-Object System.Net.WebClient)"
             ".DownloadFile('{}', '{}')".format(url, path)],
            verbose=verbose,
            exception=exception)
    else:
        if probably_big or verbose:
            option = "-#"
        else:
            option = "-s"
        run(["curl", option, "--retry", "3", "-Sf", "-o", path, url],
            verbose=verbose,
            exception=exception)


def verify(path, sha_path, verbose):
    if verbose:
        print("verifying " + path)
    with open(path, "rb") as f:
        found = hashlib.sha256(f.read()).hexdigest()
    with open(sha_path, "r") as f:
        expected = f.readline().split()[0]
    verified = found == expected
    if not verified:
        print("invalid checksum:\n"
              "    found:    {}\n"
              "    expected: {}".format(found, expected))
    return verified


def unpack(tarball, dst, verbose=False, match=None):
    print("extracting " + tarball)
    fname = os.path.basename(tarball).replace(".tar.gz", "")
    with contextlib.closing(tarfile.open(tarball)) as tar:
        for p in tar.getnames():
            if "/" not in p:
                continue
            name = p.replace(fname + "/", "", 1)
            if match is not None and not name.startswith(match):
                continue
            name = name[len(match) + 1:]

            fp = os.path.join(dst, name)
            if verbose:
                print("  extracting " + p)
            tar.extract(p, dst)
            tp = os.path.join(dst, p)
            if os.path.isdir(tp) and os.path.exists(fp):
                continue
            shutil.move(tp, fp)
    shutil.rmtree(os.path.join(dst, fname))

def run(args, verbose=False, exception=False):
    if verbose:
        print("running: " + ' '.join(args))
    sys.stdout.flush()
    # Use Popen here instead of call() as it apparently allows powershell on
    # Windows to not lock up waiting for input presumably.
    ret = subprocess.Popen(args)
    code = ret.wait()
    if code != 0:
        err = "failed to run: " + ' '.join(args)
        if verbose or exception:
            raise RuntimeError(err)
        sys.exit(err)

def stage0_data(rust_root):
    nightlies = os.path.join(rust_root, "src/stage0.txt")
    data = {}
    with open(nightlies, 'r') as nightlies:
        for line in nightlies:
            line = line.rstrip()  # Strip newline character, '\n'
            if line.startswith("#") or line == '':
                continue
            a, b = line.split(": ", 1)
            data[a] = b
    return data

def format_build_time(duration):
    return str(datetime.timedelta(seconds=int(duration)))


class RustBuild(object):
    def download_stage0(self):
        cache_dst = os.path.join(self.build_dir, "cache")
        rustc_cache = os.path.join(cache_dst, self.stage0_date())
        if not os.path.exists(rustc_cache):
            os.makedirs(rustc_cache)

        rustc_channel = self.stage0_rustc_channel()
        cargo_channel = self.stage0_cargo_channel()

        if self.rustc().startswith(self.bin_root()) and \
                (not os.path.exists(self.rustc()) or self.rustc_out_of_date()):
            self.print_what_it_means_to_bootstrap()
            if os.path.exists(self.bin_root()):
                shutil.rmtree(self.bin_root())
            filename = "rust-std-{}-{}.tar.gz".format(rustc_channel, self.build)
            url = self._download_url + "/dist/" + self.stage0_date()
            tarball = os.path.join(rustc_cache, filename)
            if not os.path.exists(tarball):
                get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
            unpack(tarball, self.bin_root(),
                   match="rust-std-" + self.build,
                   verbose=self.verbose)

            filename = "rustc-{}-{}.tar.gz".format(rustc_channel, self.build)
            url = self._download_url + "/dist/" + self.stage0_date()
            tarball = os.path.join(rustc_cache, filename)
            if not os.path.exists(tarball):
                get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
            unpack(tarball, self.bin_root(), match="rustc", verbose=self.verbose)
            self.fix_executable(self.bin_root() + "/bin/rustc")
            self.fix_executable(self.bin_root() + "/bin/rustdoc")
            with open(self.rustc_stamp(), 'w') as f:
                f.write(self.stage0_date())

            if "pc-windows-gnu" in self.build:
                filename = "rust-mingw-{}-{}.tar.gz".format(rustc_channel, self.build)
                url = self._download_url + "/dist/" + self.stage0_date()
                tarball = os.path.join(rustc_cache, filename)
                if not os.path.exists(tarball):
                    get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
                unpack(tarball, self.bin_root(), match="rust-mingw", verbose=self.verbose)

        if self.cargo().startswith(self.bin_root()) and \
                (not os.path.exists(self.cargo()) or self.cargo_out_of_date()):
            self.print_what_it_means_to_bootstrap()
            filename = "cargo-{}-{}.tar.gz".format(cargo_channel, self.build)
            url = self._download_url + "/dist/" + self.stage0_date()
            tarball = os.path.join(rustc_cache, filename)
            if not os.path.exists(tarball):
                get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
            unpack(tarball, self.bin_root(), match="cargo", verbose=self.verbose)
            self.fix_executable(self.bin_root() + "/bin/cargo")
            with open(self.cargo_stamp(), 'w') as f:
                f.write(self.stage0_date())

    def fix_executable(self, fname):
        # If we're on NixOS we need to change the path to the dynamic loader

        default_encoding = sys.getdefaultencoding()
        try:
            ostype = subprocess.check_output(['uname', '-s']).strip().decode(default_encoding)
        except (subprocess.CalledProcessError, WindowsError):
            return

        if ostype != "Linux":
            return

        if not os.path.exists("/etc/NIXOS"):
            return
        if os.path.exists("/lib"):
            return

        # At this point we're pretty sure the user is running NixOS
        print("info: you seem to be running NixOS. Attempting to patch " + fname)

        try:
            interpreter = subprocess.check_output(["patchelf", "--print-interpreter", fname])
            interpreter = interpreter.strip().decode(default_encoding)
        except subprocess.CalledProcessError as e:
            print("warning: failed to call patchelf: %s" % e)
            return

        loader = interpreter.split("/")[-1]

        try:
            ldd_output = subprocess.check_output(['ldd', '/run/current-system/sw/bin/sh'])
            ldd_output = ldd_output.strip().decode(default_encoding)
        except subprocess.CalledProcessError as e:
            print("warning: unable to call ldd: %s" % e)
            return

        for line in ldd_output.splitlines():
            libname = line.split()[0]
            if libname.endswith(loader):
                loader_path = libname[:len(libname) - len(loader)]
                break
        else:
            print("warning: unable to find the path to the dynamic linker")
            return

        correct_interpreter = loader_path + loader

        try:
            subprocess.check_output(["patchelf", "--set-interpreter", correct_interpreter, fname])
        except subprocess.CalledProcessError as e:
            print("warning: failed to call patchelf: %s" % e)
            return

    def stage0_date(self):
        return self._date

    def stage0_rustc_channel(self):
        return self._rustc_channel

    def stage0_cargo_channel(self):
        return self._cargo_channel

    def rustc_stamp(self):
        return os.path.join(self.bin_root(), '.rustc-stamp')

    def cargo_stamp(self):
        return os.path.join(self.bin_root(), '.cargo-stamp')

    def rustc_out_of_date(self):
        if not os.path.exists(self.rustc_stamp()) or self.clean:
            return True
        with open(self.rustc_stamp(), 'r') as f:
            return self.stage0_date() != f.read()

    def cargo_out_of_date(self):
        if not os.path.exists(self.cargo_stamp()) or self.clean:
            return True
        with open(self.cargo_stamp(), 'r') as f:
            return self.stage0_date() != f.read()

    def bin_root(self):
        return os.path.join(self.build_dir, self.build, "stage0")

    def get_toml(self, key):
        for line in self.config_toml.splitlines():
            if line.startswith(key + ' ='):
                return self.get_string(line)
        return None

    def get_mk(self, key):
        for line in iter(self.config_mk.splitlines()):
            if line.startswith(key + ' '):
                var = line[line.find(':=') + 2:].strip()
                if var != '':
                    return var
        return None

    def cargo(self):
        config = self.get_toml('cargo')
        if config:
            return config
        config = self.get_mk('CFG_LOCAL_RUST_ROOT')
        if config:
            return config + '/bin/cargo' + self.exe_suffix()
        return os.path.join(self.bin_root(), "bin/cargo" + self.exe_suffix())

    def rustc(self):
        config = self.get_toml('rustc')
        if config:
            return config
        config = self.get_mk('CFG_LOCAL_RUST_ROOT')
        if config:
            return config + '/bin/rustc' + self.exe_suffix()
        return os.path.join(self.bin_root(), "bin/rustc" + self.exe_suffix())

    def get_string(self, line):
        start = line.find('"')
        end = start + 1 + line[start + 1:].find('"')
        return line[start + 1:end]

    def exe_suffix(self):
        if sys.platform == 'win32':
            return '.exe'
        else:
            return ''

    def print_what_it_means_to_bootstrap(self):
        if hasattr(self, 'printed'):
            return
        self.printed = True
        if os.path.exists(self.bootstrap_binary()):
            return
        if not '--help' in sys.argv or len(sys.argv) == 1:
            return

        print('info: the build system for Rust is written in Rust, so this')
        print('      script is now going to download a stage0 rust compiler')
        print('      and then compile the build system itself')
        print('')
        print('info: in the meantime you can read more about rustbuild at')
        print('      src/bootstrap/README.md before the download finishes')

    def bootstrap_binary(self):
        return os.path.join(self.build_dir, "bootstrap/debug/bootstrap")

    def build_bootstrap(self):
        self.print_what_it_means_to_bootstrap()
        build_dir = os.path.join(self.build_dir, "bootstrap")
        if self.clean and os.path.exists(build_dir):
            shutil.rmtree(build_dir)
        env = os.environ.copy()
        env["CARGO_TARGET_DIR"] = build_dir
        env["RUSTC"] = self.rustc()
        env["LD_LIBRARY_PATH"] = os.path.join(self.bin_root(), "lib") + \
                                 (os.pathsep + env["LD_LIBRARY_PATH"]) \
                                 if "LD_LIBRARY_PATH" in env else ""
        env["DYLD_LIBRARY_PATH"] = os.path.join(self.bin_root(), "lib") + \
                                   (os.pathsep + env["DYLD_LIBRARY_PATH"]) \
                                   if "DYLD_LIBRARY_PATH" in env else ""
        env["LIBRARY_PATH"] = os.path.join(self.bin_root(), "lib") + \
                              (os.pathsep + env["LIBRARY_PATH"]) \
                              if "LIBRARY_PATH" in env else ""
        env["PATH"] = os.path.join(self.bin_root(), "bin") + \
                      os.pathsep + env["PATH"]
        if not os.path.isfile(self.cargo()):
            raise Exception("no cargo executable found at `%s`" % self.cargo())
        args = [self.cargo(), "build", "--manifest-path",
                os.path.join(self.rust_root, "src/bootstrap/Cargo.toml")]
        if self.use_locked_deps:
            args.append("--locked")
        if self.use_vendored_sources:
            args.append("--frozen")
        self.run(args, env)

    def run(self, args, env):
        proc = subprocess.Popen(args, env=env)
        ret = proc.wait()
        if ret != 0:
            sys.exit(ret)

    def build_triple(self):
        default_encoding = sys.getdefaultencoding()
        config = self.get_toml('build')
        if config:
            return config
        config = self.get_mk('CFG_BUILD')
        if config:
            return config
        try:
            ostype = subprocess.check_output(['uname', '-s']).strip().decode(default_encoding)
            cputype = subprocess.check_output(['uname', '-m']).strip().decode(default_encoding)
        except (subprocess.CalledProcessError, OSError):
            if sys.platform == 'win32':
                return 'x86_64-pc-windows-msvc'
            err = "uname not found"
            if self.verbose:
                raise Exception(err)
            sys.exit(err)

        # The goal here is to come up with the same triple as LLVM would,
        # at least for the subset of platforms we're willing to target.
        if ostype == 'Linux':
            os_from_sp = subprocess.check_output(['uname', '-o']).strip().decode(default_encoding)
            if os_from_sp == 'Android':
                ostype = 'linux-android'
            else:
                ostype = 'unknown-linux-gnu'
        elif ostype == 'FreeBSD':
            ostype = 'unknown-freebsd'
        elif ostype == 'DragonFly':
            ostype = 'unknown-dragonfly'
        elif ostype == 'Bitrig':
            ostype = 'unknown-bitrig'
        elif ostype == 'OpenBSD':
            ostype = 'unknown-openbsd'
        elif ostype == 'NetBSD':
            ostype = 'unknown-netbsd'
        elif ostype == 'SunOS':
            ostype = 'sun-solaris'
            # On Solaris, uname -m will return a machine classification instead
            # of a cpu type, so uname -p is recommended instead. However, the
            # output from that option is too generic for our purposes (it will
            # always emit 'i386' on x86/amd64 systems). As such, isainfo -k
            # must be used instead.
            try:
                cputype = subprocess.check_output(['isainfo',
                    '-k']).strip().decode(default_encoding)
            except (subprocess.CalledProcessError, OSError):
                err = "isainfo not found"
                if self.verbose:
                    raise Exception(err)
                sys.exit(err)
        elif ostype == 'Darwin':
            ostype = 'apple-darwin'
        elif ostype == 'Haiku':
            ostype = 'unknown-haiku'
        elif ostype.startswith('MINGW'):
            # msys' `uname` does not print gcc configuration, but prints msys
            # configuration. so we cannot believe `uname -m`:
            # msys1 is always i686 and msys2 is always x86_64.
            # instead, msys defines $MSYSTEM which is MINGW32 on i686 and
            # MINGW64 on x86_64.
            ostype = 'pc-windows-gnu'
            cputype = 'i686'
            if os.environ.get('MSYSTEM') == 'MINGW64':
                cputype = 'x86_64'
        elif ostype.startswith('MSYS'):
            ostype = 'pc-windows-gnu'
        elif ostype.startswith('CYGWIN_NT'):
            cputype = 'i686'
            if ostype.endswith('WOW64'):
                cputype = 'x86_64'
            ostype = 'pc-windows-gnu'
        else:
            err = "unknown OS type: " + ostype
            if self.verbose:
                raise ValueError(err)
            sys.exit(err)

        if cputype in {'i386', 'i486', 'i686', 'i786', 'x86'}:
            cputype = 'i686'
        elif cputype in {'xscale', 'arm'}:
            cputype = 'arm'
            if ostype == 'linux-android':
                ostype = 'linux-androideabi'
        elif cputype == 'armv6l':
            cputype = 'arm'
            if ostype == 'linux-android':
                ostype = 'linux-androideabi'
            else:
                ostype += 'eabihf'
        elif cputype in {'armv7l', 'armv8l'}:
            cputype = 'armv7'
            if ostype == 'linux-android':
                ostype = 'linux-androideabi'
            else:
                ostype += 'eabihf'
        elif cputype in {'aarch64', 'arm64'}:
            cputype = 'aarch64'
        elif cputype == 'mips':
            if sys.byteorder == 'big':
                cputype = 'mips'
            elif sys.byteorder == 'little':
                cputype = 'mipsel'
            else:
                raise ValueError('unknown byteorder: ' + sys.byteorder)
        elif cputype == 'mips64':
            if sys.byteorder == 'big':
                cputype = 'mips64'
            elif sys.byteorder == 'little':
                cputype = 'mips64el'
            else:
                raise ValueError('unknown byteorder: ' + sys.byteorder)
            # only the n64 ABI is supported, indicate it
            ostype += 'abi64'
        elif cputype in {'powerpc', 'ppc'}:
            cputype = 'powerpc'
        elif cputype in {'powerpc64', 'ppc64'}:
            cputype = 'powerpc64'
        elif cputype in {'powerpc64le', 'ppc64le'}:
            cputype = 'powerpc64le'
        elif cputype == 'sparcv9':
            pass
        elif cputype in {'amd64', 'x86_64', 'x86-64', 'x64'}:
            cputype = 'x86_64'
        elif cputype == 's390x':
            cputype = 's390x'
        elif cputype == 'BePC':
            cputype = 'i686'
        else:
            err = "unknown cpu type: " + cputype
            if self.verbose:
                raise ValueError(err)
            sys.exit(err)

        return "{}-{}".format(cputype, ostype)

def bootstrap():
    parser = argparse.ArgumentParser(description='Build rust')
    parser.add_argument('--config')
    parser.add_argument('--clean', action='store_true')
    parser.add_argument('-v', '--verbose', action='store_true')

    args = [a for a in sys.argv if a != '-h' and a != '--help']
    args, _ = parser.parse_known_args(args)

    # Configure initial bootstrap
    rb = RustBuild()
    rb.config_toml = ''
    rb.config_mk = ''
    rb.rust_root = os.path.abspath(os.path.join(__file__, '../../..'))
    rb.build_dir = os.path.join(os.getcwd(), "build")
    rb.verbose = args.verbose
    rb.clean = args.clean

    try:
        with open(args.config or 'config.toml') as config:
            rb.config_toml = config.read()
    except:
        pass
    try:
        rb.config_mk = open('config.mk').read()
    except:
        pass

    rb.use_vendored_sources = '\nvendor = true' in rb.config_toml or \
                              'CFG_ENABLE_VENDOR' in rb.config_mk

    rb.use_locked_deps = '\nlocked-deps = true' in rb.config_toml or \
                         'CFG_ENABLE_LOCKED_DEPS' in rb.config_mk

    if 'SUDO_USER' in os.environ and not rb.use_vendored_sources:
        if os.environ.get('USER') != os.environ['SUDO_USER']:
            rb.use_vendored_sources = True
            print('info: looks like you are running this command under `sudo`')
            print('      and so in order to preserve your $HOME this will now')
            print('      use vendored sources by default. Note that if this')
            print('      does not work you should run a normal build first')
            print('      before running a command like `sudo make install`')

    if rb.use_vendored_sources:
        if not os.path.exists('.cargo'):
            os.makedirs('.cargo')
        with open('.cargo/config','w') as f:
            f.write("""
                [source.crates-io]
                replace-with = 'vendored-sources'
                registry = 'https://example.com'

                [source.vendored-sources]
                directory = '{}/src/vendor'
            """.format(rb.rust_root))
    else:
        if os.path.exists('.cargo'):
            shutil.rmtree('.cargo')

    data = stage0_data(rb.rust_root)
    rb._date = data['date']
    rb._rustc_channel = data['rustc']
    rb._cargo_channel = data['cargo']
    if 'dev' in data:
        rb._download_url = 'https://dev-static.rust-lang.org'
    else:
        rb._download_url = 'https://static.rust-lang.org'

    # Fetch/build the bootstrap
    rb.build = rb.build_triple()
    rb.download_stage0()
    sys.stdout.flush()
    rb.build_bootstrap()
    sys.stdout.flush()

    # Run the bootstrap
    args = [rb.bootstrap_binary()]
    args.extend(sys.argv[1:])
    env = os.environ.copy()
    env["BUILD"] = rb.build
    env["SRC"] = rb.rust_root
    env["BOOTSTRAP_PARENT_ID"] = str(os.getpid())
    rb.run(args, env)

def main():
    start_time = time()
    help_triggered = ('-h' in sys.argv) or ('--help' in sys.argv) or (len(sys.argv) == 1)
    try:
        bootstrap()
        if not help_triggered:
            print("Build completed successfully in %s" % format_build_time(time() - start_time))
    except (SystemExit, KeyboardInterrupt) as e:
        if hasattr(e, 'code') and isinstance(e.code, int):
            exit_code = e.code
        else:
            exit_code = 1
            print(e)
        if not help_triggered:
            print("Build completed unsuccessfully in %s" % format_build_time(time() - start_time))
        sys.exit(exit_code)

if __name__ == '__main__':
    main()
```
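The `stage0_data()` function in the script above drives the whole stage0 download: it reads `src/stage0.txt` as `key: value` lines, skipping comments and blanks, and the resulting dict supplies `date`, `rustc`, and `cargo`. A standalone sketch of that parse step (`parse_stage0` is a hypothetical name; the real function reads from a file path):

```python
def parse_stage0(text):
    """Parse `key: value` lines the way stage0_data() does, ignoring
    comment lines starting with '#' and empty lines."""
    data = {}
    for line in text.splitlines():
        line = line.rstrip()
        if line.startswith("#") or line == "":
            continue
        key, value = line.split(": ", 1)
        data[key] = value
    return data

# Illustrative input in the stage0.txt format (values are made up).
sample = "# stage0 release\ndate: 2017-03-17\nrustc: beta\ncargo: beta\n"
```

Splitting on `": "` with `maxsplit=1` is what lets values themselves contain colons, e.g. URLs.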
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/cc.rs version [2af5c09bc2].
```rust
// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! C-compiler probing and detection.
//!
//! This module will fill out the `cc` and `cxx` maps of `Build` by looking for
//! C and C++ compilers for each target configured. A compiler is found through
//! a number of vectors (in order of precedence)
//!
//! 1. Configuration via `target.$target.cc` in `config.toml`.
//! 2. Configuration via `target.$target.android-ndk` in `config.toml`, if
//!    applicable
//! 3. Special logic to probe on OpenBSD
//! 4. The `CC_$target` environment variable.
//! 5. The `CC` environment variable.
//! 6. "cc"
//!
//! Some of this logic is implemented here, but much of it is farmed out to the
//! `gcc` crate itself, so we end up having the same fallbacks as there.
//! Similar logic is then used to find a C++ compiler, just some s/cc/c++/ is
//! used.
//!
//! It is intended that after this module has run no C/C++ compiler will
//! ever be probed for. Instead the compilers found here will be used for
//! everything.

use std::process::Command;

use build_helper::{cc2ar, output};
use gcc;

use Build;
use config::Target;

pub fn find(build: &mut Build) {
    // For all targets we're going to need a C compiler for building some shims
    // and such as well as for being a linker for Rust code.
    for target in build.config.target.iter() {
        let mut cfg = gcc::Config::new();
        cfg.cargo_metadata(false).opt_level(0).debug(false)
           .target(target).host(&build.config.build);

        let config = build.config.target_config.get(target);
        if let Some(cc) = config.and_then(|c| c.cc.as_ref()) {
            cfg.compiler(cc);
        } else {
            set_compiler(&mut cfg, "gcc", target, config, build);
        }

        let compiler = cfg.get_compiler();
        let ar = cc2ar(compiler.path(), target);
        build.verbose(&format!("CC_{} = {:?}", target, compiler.path()));
        if let Some(ref ar) = ar {
            build.verbose(&format!("AR_{} = {:?}", target, ar));
        }
        build.cc.insert(target.to_string(), (compiler, ar));
    }

    // For all host triples we need to find a C++ compiler as well
    for host in build.config.host.iter() {
        let mut cfg = gcc::Config::new();
        cfg.cargo_metadata(false).opt_level(0).debug(false).cpp(true)
           .target(host).host(&build.config.build);
        let config = build.config.target_config.get(host);
        if let Some(cxx) = config.and_then(|c| c.cxx.as_ref()) {
            cfg.compiler(cxx);
        } else {
            set_compiler(&mut cfg, "g++", host, config, build);
        }
        let compiler = cfg.get_compiler();
        build.verbose(&format!("CXX_{} = {:?}", host, compiler.path()));
        build.cxx.insert(host.to_string(), compiler);
    }
}

fn set_compiler(cfg: &mut gcc::Config,
                gnu_compiler: &str,
                target: &str,
                config: Option<&Target>,
                build: &Build) {
    match target {
        // When compiling for android we may have the NDK configured in the
        // config.toml in which case we look there. Otherwise the default
        // compiler already takes into account the triple in question.
        t if t.contains("android") => {
            if let Some(ndk) = config.and_then(|c| c.ndk.as_ref()) {
                let target = target.replace("armv7", "arm");
                let compiler = format!("{}-{}", target, gnu_compiler);
                cfg.compiler(ndk.join("bin").join(compiler));
            }
        }

        // The default gcc version from OpenBSD may be too old, try using egcc,
        // which is a gcc version from ports, if this is the case.
        t if t.contains("openbsd") => {
            let c = cfg.get_compiler();
            if !c.path().ends_with(gnu_compiler) {
                return
            }

            let output = output(c.to_command().arg("--version"));
            let i = match output.find(" 4.") {
                Some(i) => i,
                None => return,
            };
            match output[i + 3..].chars().next().unwrap() {
                '0' ... '6' => {}
                _ => return,
            }
            let alternative = format!("e{}", gnu_compiler);
            if Command::new(&alternative).output().is_ok() {
                cfg.compiler(alternative);
            }
        }

        "mips-unknown-linux-musl" => {
            if cfg.get_compiler().path().to_str() == Some("gcc") {
                cfg.compiler("mips-linux-musl-gcc");
            }
        }
        "mipsel-unknown-linux-musl" => {
            if cfg.get_compiler().path().to_str() == Some("gcc") {
                cfg.compiler("mipsel-linux-musl-gcc");
            }
        }

        t if t.contains("musl") => {
            if let Some(root) = build.musl_root(target) {
                let guess = root.join("bin/musl-gcc");
                if guess.exists() {
                    cfg.compiler(guess);
                }
            }
        }

        _ => {}
    }
}
```
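The module comment in `cc.rs` above spells out a precedence order for finding a C compiler. A minimal sketch of that resolution chain, assuming a hypothetical `pick_cc` helper and a simplified `CC_$target` convention with dashes mapped to underscores (the real lookup is delegated to the `gcc` crate, which also tries a few other spellings):

```python
def pick_cc(target, config_cc, env):
    """Resolve a C compiler with the precedence described in the module
    comment above: config.toml entry first, then CC_$target, then CC,
    then the plain "cc" fallback. NDK and OpenBSD probing are omitted."""
    if config_cc:
        return config_cc
    per_target = env.get("CC_" + target.replace("-", "_"))
    return per_target or env.get("CC") or "cc"
```

The ordering matters: an explicit `config.toml` entry must beat environment variables, so a user's build configuration cannot be silently overridden by a stale `CC` in the shell.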
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/channel.rs version [4119afcb79].
1 -// Copyright 2015 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Build configuration for Rust's release channels. 12 -//! 13 -//! Implements the stable/beta/nightly channel distinctions by setting various 14 -//! flags like the `unstable_features`, calculating variables like `release` and 15 -//! `package_vers`, and otherwise indicating to the compiler what it should 16 -//! print out as part of its version information. 17 - 18 -use std::path::Path; 19 -use std::process::Command; 20 - 21 -use build_helper::output; 22 - 23 -use Build; 24 - 25 -// The version number 26 -pub const CFG_RELEASE_NUM: &'static str = "1.19.0"; 27 - 28 -// An optional number to put after the label, e.g. 
'.2' -> '-beta.2' 29 -// Be sure to make this starts with a dot to conform to semver pre-release 30 -// versions (section 9) 31 -pub const CFG_PRERELEASE_VERSION: &'static str = ".1"; 32 - 33 -pub struct GitInfo { 34 - inner: Option<Info>, 35 -} 36 - 37 -struct Info { 38 - commit_date: String, 39 - sha: String, 40 - short_sha: String, 41 -} 42 - 43 -impl GitInfo { 44 - pub fn new(dir: &Path) -> GitInfo { 45 - // See if this even begins to look like a git dir 46 - if !dir.join(".git").exists() { 47 - return GitInfo { inner: None } 48 - } 49 - 50 - // Make sure git commands work 51 - let out = Command::new("git") 52 - .arg("rev-parse") 53 - .current_dir(dir) 54 - .output() 55 - .expect("failed to spawn git"); 56 - if !out.status.success() { 57 - return GitInfo { inner: None } 58 - } 59 - 60 - // Ok, let's scrape some info 61 - let ver_date = output(Command::new("git").current_dir(dir) 62 - .arg("log").arg("-1") 63 - .arg("--date=short") 64 - .arg("--pretty=format:%cd")); 65 - let ver_hash = output(Command::new("git").current_dir(dir) 66 - .arg("rev-parse").arg("HEAD")); 67 - let short_ver_hash = output(Command::new("git") 68 - .current_dir(dir) 69 - .arg("rev-parse") 70 - .arg("--short=9") 71 - .arg("HEAD")); 72 - GitInfo { 73 - inner: Some(Info { 74 - commit_date: ver_date.trim().to_string(), 75 - sha: ver_hash.trim().to_string(), 76 - short_sha: short_ver_hash.trim().to_string(), 77 - }), 78 - } 79 - } 80 - 81 - pub fn sha(&self) -> Option<&str> { 82 - self.inner.as_ref().map(|s| &s.sha[..]) 83 - } 84 - 85 - pub fn sha_short(&self) -> Option<&str> { 86 - self.inner.as_ref().map(|s| &s.short_sha[..]) 87 - } 88 - 89 - pub fn commit_date(&self) -> Option<&str> { 90 - self.inner.as_ref().map(|s| &s.commit_date[..]) 91 - } 92 - 93 - pub fn version(&self, build: &Build, num: &str) -> String { 94 - let mut version = build.release(num); 95 - if let Some(ref inner) = self.inner { 96 - version.push_str(" ("); 97 - version.push_str(&inner.short_sha); 98 - version.push_str(" 
"); 99 - version.push_str(&inner.commit_date); 100 - version.push_str(")"); 101 - } 102 - return version 103 - } 104 -}
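The `version()` method in the deleted `channel.rs` above assembles a string of the form `<release> (<short sha> <commit date>)`, falling back to the bare release when no git information is available. A minimal sketch of that formatting, with `format_version` as a hypothetical stand-in for the method:

```rust
// Sketch of how GitInfo::version (above) formats the version string.
// `git` carries (short_sha, commit_date) when the source tree is a git
// checkout, and None otherwise.
fn format_version(release: &str, git: Option<(&str, &str)>) -> String {
    let mut version = release.to_string();
    if let Some((short_sha, commit_date)) = git {
        version.push_str(" (");
        version.push_str(short_sha);
        version.push(' ');
        version.push_str(commit_date);
        version.push(')');
    }
    version
}
```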
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/check.rs version [8878728b74].
1 -// Copyright 2016 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Implementation of the test-related targets of the build system. 12 -//! 13 -//! This file implements the various regression test suites that we execute on 14 -//! our CI. 15 - 16 -extern crate build_helper; 17 - 18 -use std::collections::HashSet; 19 -use std::env; 20 -use std::fmt; 21 -use std::fs; 22 -use std::path::{PathBuf, Path}; 23 -use std::process::Command; 24 - 25 -use build_helper::output; 26 - 27 -use {Build, Compiler, Mode}; 28 -use dist; 29 -use util::{self, dylib_path, dylib_path_var, exe}; 30 - 31 -const ADB_TEST_DIR: &'static str = "/data/tmp/work"; 32 - 33 -/// The two modes of the test runner; tests or benchmarks. 34 -#[derive(Copy, Clone)] 35 -pub enum TestKind { 36 - /// Run `cargo test` 37 - Test, 38 - /// Run `cargo bench` 39 - Bench, 40 -} 41 - 42 -impl TestKind { 43 - // Return the cargo subcommand for this test kind 44 - fn subcommand(self) -> &'static str { 45 - match self { 46 - TestKind::Test => "test", 47 - TestKind::Bench => "bench", 48 - } 49 - } 50 -} 51 - 52 -impl fmt::Display for TestKind { 53 - fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { 54 - f.write_str(match *self { 55 - TestKind::Test => "Testing", 56 - TestKind::Bench => "Benchmarking", 57 - }) 58 - } 59 -} 60 - 61 -/// Runs the `linkchecker` tool as compiled in `stage` by the `host` compiler. 62 -/// 63 -/// This tool in `src/tools` will verify the validity of all our links in the 64 -/// documentation to ensure we don't have a bunch of dead ones. 
65 -pub fn linkcheck(build: &Build, host: &str) { 66 - println!("Linkcheck ({})", host); 67 - let compiler = Compiler::new(0, host); 68 - 69 - let _time = util::timeit(); 70 - build.run(build.tool_cmd(&compiler, "linkchecker") 71 - .arg(build.out.join(host).join("doc"))); 72 -} 73 - 74 -/// Runs the `cargotest` tool as compiled in `stage` by the `host` compiler. 75 -/// 76 -/// This tool in `src/tools` will check out a few Rust projects and run `cargo 77 -/// test` to ensure that we don't regress the test suites there. 78 -pub fn cargotest(build: &Build, stage: u32, host: &str) { 79 - let ref compiler = Compiler::new(stage, host); 80 - 81 - // Note that this is a short, cryptic, and not scoped directory name. This 82 - // is currently to minimize the length of path on Windows where we otherwise 83 - // quickly run into path name limit constraints. 84 - let out_dir = build.out.join("ct"); 85 - t!(fs::create_dir_all(&out_dir)); 86 - 87 - let _time = util::timeit(); 88 - let mut cmd = Command::new(build.tool(&Compiler::new(0, host), "cargotest")); 89 - build.prepare_tool_cmd(compiler, &mut cmd); 90 - build.run(cmd.arg(&build.cargo) 91 - .arg(&out_dir) 92 - .env("RUSTC", build.compiler_path(compiler)) 93 - .env("RUSTDOC", build.rustdoc(compiler))) 94 -} 95 - 96 -/// Runs `cargo test` for `cargo` packaged with Rust. 97 -pub fn cargo(build: &Build, stage: u32, host: &str) { 98 - let ref compiler = Compiler::new(stage, host); 99 - 100 - // Configure PATH to find the right rustc. NB. we have to use PATH 101 - // and not RUSTC because the Cargo test suite has tests that will 102 - // fail if rustc is not spelled `rustc`. 
103 - let path = build.sysroot(compiler).join("bin"); 104 - let old_path = ::std::env::var("PATH").expect(""); 105 - let sep = if cfg!(windows) { ";" } else {":" }; 106 - let ref newpath = format!("{}{}{}", path.display(), sep, old_path); 107 - 108 - let mut cargo = build.cargo(compiler, Mode::Tool, host, "test"); 109 - cargo.arg("--manifest-path").arg(build.src.join("src/tools/cargo/Cargo.toml")); 110 - 111 - // Don't build tests dynamically, just a pain to work with 112 - cargo.env("RUSTC_NO_PREFER_DYNAMIC", "1"); 113 - 114 - // Don't run cross-compile tests, we may not have cross-compiled libstd libs 115 - // available. 116 - cargo.env("CFG_DISABLE_CROSS_TESTS", "1"); 117 - 118 - build.run(cargo.env("PATH", newpath)); 119 -} 120 - 121 -/// Runs the `tidy` tool as compiled in `stage` by the `host` compiler. 122 -/// 123 -/// This tool in `src/tools` checks up on various bits and pieces of style and 124 -/// otherwise just implements a few lint-like checks that are specific to the 125 -/// compiler itself. 126 -pub fn tidy(build: &Build, host: &str) { 127 - println!("tidy check ({})", host); 128 - let compiler = Compiler::new(0, host); 129 - let mut cmd = build.tool_cmd(&compiler, "tidy"); 130 - cmd.arg(build.src.join("src")); 131 - if !build.config.vendor { 132 - cmd.arg("--no-vendor"); 133 - } 134 - build.run(&mut cmd); 135 -} 136 - 137 -fn testdir(build: &Build, host: &str) -> PathBuf { 138 - build.out.join(host).join("test") 139 -} 140 - 141 -/// Executes the `compiletest` tool to run a suite of tests. 142 -/// 143 -/// Compiles all tests with `compiler` for `target` with the specified 144 -/// compiletest `mode` and `suite` arguments. For example `mode` can be 145 -/// "run-pass" or `suite` can be something like `debuginfo`. 
146 -pub fn compiletest(build: &Build, 147 - compiler: &Compiler, 148 - target: &str, 149 - mode: &str, 150 - suite: &str) { 151 - println!("Check compiletest suite={} mode={} ({} -> {})", 152 - suite, mode, compiler.host, target); 153 - let mut cmd = Command::new(build.tool(&Compiler::new(0, compiler.host), 154 - "compiletest")); 155 - build.prepare_tool_cmd(compiler, &mut cmd); 156 - 157 - // compiletest currently has... a lot of arguments, so let's just pass all 158 - // of them! 159 - 160 - cmd.arg("--compile-lib-path").arg(build.rustc_libdir(compiler)); 161 - cmd.arg("--run-lib-path").arg(build.sysroot_libdir(compiler, target)); 162 - cmd.arg("--rustc-path").arg(build.compiler_path(compiler)); 163 - cmd.arg("--rustdoc-path").arg(build.rustdoc(compiler)); 164 - cmd.arg("--src-base").arg(build.src.join("src/test").join(suite)); 165 - cmd.arg("--build-base").arg(testdir(build, compiler.host).join(suite)); 166 - cmd.arg("--stage-id").arg(format!("stage{}-{}", compiler.stage, target)); 167 - cmd.arg("--mode").arg(mode); 168 - cmd.arg("--target").arg(target); 169 - cmd.arg("--host").arg(compiler.host); 170 - cmd.arg("--llvm-filecheck").arg(build.llvm_filecheck(&build.config.build)); 171 - 172 - if let Some(nodejs) = build.config.nodejs.as_ref() { 173 - cmd.arg("--nodejs").arg(nodejs); 174 - } 175 - 176 - let mut flags = vec!["-Crpath".to_string()]; 177 - if build.config.rust_optimize_tests { 178 - flags.push("-O".to_string()); 179 - } 180 - if build.config.rust_debuginfo_tests { 181 - flags.push("-g".to_string()); 182 - } 183 - 184 - let mut hostflags = build.rustc_flags(&compiler.host); 185 - hostflags.extend(flags.clone()); 186 - cmd.arg("--host-rustcflags").arg(hostflags.join(" ")); 187 - 188 - let mut targetflags = build.rustc_flags(&target); 189 - targetflags.extend(flags); 190 - targetflags.push(format!("-Lnative={}", 191 - build.test_helpers_out(target).display())); 192 - cmd.arg("--target-rustcflags").arg(targetflags.join(" ")); 193 - 194 - 
cmd.arg("--docck-python").arg(build.python()); 195 - 196 - if build.config.build.ends_with("apple-darwin") { 197 - // Force /usr/bin/python on macOS for LLDB tests because we're loading the 198 - // LLDB plugin's compiled module which only works with the system python 199 - // (namely not Homebrew-installed python) 200 - cmd.arg("--lldb-python").arg("/usr/bin/python"); 201 - } else { 202 - cmd.arg("--lldb-python").arg(build.python()); 203 - } 204 - 205 - if let Some(ref gdb) = build.config.gdb { 206 - cmd.arg("--gdb").arg(gdb); 207 - } 208 - if let Some(ref vers) = build.lldb_version { 209 - cmd.arg("--lldb-version").arg(vers); 210 - } 211 - if let Some(ref dir) = build.lldb_python_dir { 212 - cmd.arg("--lldb-python-dir").arg(dir); 213 - } 214 - let llvm_config = build.llvm_config(target); 215 - let llvm_version = output(Command::new(&llvm_config).arg("--version")); 216 - cmd.arg("--llvm-version").arg(llvm_version); 217 - 218 - cmd.args(&build.flags.cmd.test_args()); 219 - 220 - if build.config.verbose() || build.flags.verbose() { 221 - cmd.arg("--verbose"); 222 - } 223 - 224 - if build.config.quiet_tests { 225 - cmd.arg("--quiet"); 226 - } 227 - 228 - // Only pass correct values for these flags for the `run-make` suite as it 229 - // requires that a C++ compiler was configured which isn't always the case. 
230 - if suite == "run-make" { 231 - let llvm_components = output(Command::new(&llvm_config).arg("--components")); 232 - let llvm_cxxflags = output(Command::new(&llvm_config).arg("--cxxflags")); 233 - cmd.arg("--cc").arg(build.cc(target)) 234 - .arg("--cxx").arg(build.cxx(target)) 235 - .arg("--cflags").arg(build.cflags(target).join(" ")) 236 - .arg("--llvm-components").arg(llvm_components.trim()) 237 - .arg("--llvm-cxxflags").arg(llvm_cxxflags.trim()); 238 - } else { 239 - cmd.arg("--cc").arg("") 240 - .arg("--cxx").arg("") 241 - .arg("--cflags").arg("") 242 - .arg("--llvm-components").arg("") 243 - .arg("--llvm-cxxflags").arg(""); 244 - } 245 - 246 - if build.remote_tested(target) { 247 - cmd.arg("--remote-test-client") 248 - .arg(build.tool(&Compiler::new(0, &build.config.build), 249 - "remote-test-client")); 250 - } 251 - 252 - // Running a C compiler on MSVC requires a few env vars to be set, to be 253 - // sure to set them here. 254 - // 255 - // Note that if we encounter `PATH` we make sure to append to our own `PATH` 256 - // rather than stomp over it. 257 - if target.contains("msvc") { 258 - for &(ref k, ref v) in build.cc[target].0.env() { 259 - if k != "PATH" { 260 - cmd.env(k, v); 261 - } 262 - } 263 - } 264 - cmd.env("RUSTC_BOOTSTRAP", "1"); 265 - build.add_rust_test_threads(&mut cmd); 266 - 267 - if build.config.sanitizers { 268 - cmd.env("SANITIZER_SUPPORT", "1"); 269 - } 270 - 271 - cmd.arg("--adb-path").arg("adb"); 272 - cmd.arg("--adb-test-dir").arg(ADB_TEST_DIR); 273 - if target.contains("android") { 274 - // Assume that cc for this target comes from the android sysroot 275 - cmd.arg("--android-cross-path") 276 - .arg(build.cc(target).parent().unwrap().parent().unwrap()); 277 - } else { 278 - cmd.arg("--android-cross-path").arg(""); 279 - } 280 - 281 - let _time = util::timeit(); 282 - build.run(&mut cmd); 283 -} 284 - 285 -/// Run `rustdoc --test` for all documentation in `src/doc`. 
286 -/// 287 -/// This will run all tests in our markdown documentation (e.g. the book) 288 -/// located in `src/doc`. The `rustdoc` that's run is the one that sits next to 289 -/// `compiler`. 290 -pub fn docs(build: &Build, compiler: &Compiler) { 291 - // Do a breadth-first traversal of the `src/doc` directory and just run 292 - // tests for all files that end in `*.md` 293 - let mut stack = vec![build.src.join("src/doc")]; 294 - let _time = util::timeit(); 295 - 296 - while let Some(p) = stack.pop() { 297 - if p.is_dir() { 298 - stack.extend(t!(p.read_dir()).map(|p| t!(p).path())); 299 - continue 300 - } 301 - 302 - if p.extension().and_then(|s| s.to_str()) != Some("md") { 303 - continue 304 - } 305 - 306 - // The nostarch directory in the book is for no starch, and so isn't guaranteed to build. 307 - // we don't care if it doesn't build, so skip it. 308 - use std::ffi::OsStr; 309 - let path: &OsStr = p.as_ref(); 310 - if let Some(path) = path.to_str() { 311 - if path.contains("nostarch") { 312 - continue; 313 - } 314 - } 315 - 316 - println!("doc tests for: {}", p.display()); 317 - markdown_test(build, compiler, &p); 318 - } 319 -} 320 - 321 -/// Run the error index generator tool to execute the tests located in the error 322 -/// index. 323 -/// 324 -/// The `error_index_generator` tool lives in `src/tools` and is used to 325 -/// generate a markdown file from the error indexes of the code base which is 326 -/// then passed to `rustdoc --test`. 
327 -pub fn error_index(build: &Build, compiler: &Compiler) { 328 - println!("Testing error-index stage{}", compiler.stage); 329 - 330 - let dir = testdir(build, compiler.host); 331 - t!(fs::create_dir_all(&dir)); 332 - let output = dir.join("error-index.md"); 333 - 334 - let _time = util::timeit(); 335 - build.run(build.tool_cmd(&Compiler::new(0, compiler.host), 336 - "error_index_generator") 337 - .arg("markdown") 338 - .arg(&output) 339 - .env("CFG_BUILD", &build.config.build)); 340 - 341 - markdown_test(build, compiler, &output); 342 -} 343 - 344 -fn markdown_test(build: &Build, compiler: &Compiler, markdown: &Path) { 345 - let mut cmd = Command::new(build.rustdoc(compiler)); 346 - build.add_rustc_lib_path(compiler, &mut cmd); 347 - build.add_rust_test_threads(&mut cmd); 348 - cmd.arg("--test"); 349 - cmd.arg(markdown); 350 - cmd.env("RUSTC_BOOTSTRAP", "1"); 351 - 352 - let mut test_args = build.flags.cmd.test_args().join(" "); 353 - if build.config.quiet_tests { 354 - test_args.push_str(" --quiet"); 355 - } 356 - cmd.arg("--test-args").arg(test_args); 357 - 358 - build.run(&mut cmd); 359 -} 360 - 361 -/// Run all unit tests plus documentation tests for an entire crate DAG defined 362 -/// by a `Cargo.toml` 363 -/// 364 -/// This is what runs tests for crates like the standard library, compiler, etc. 365 -/// It essentially is the driver for running `cargo test`. 366 -/// 367 -/// Currently this runs all tests for a DAG by passing a bunch of `-p foo` 368 -/// arguments, and those arguments are discovered from `cargo metadata`. 
369 -pub fn krate(build: &Build, 370 - compiler: &Compiler, 371 - target: &str, 372 - mode: Mode, 373 - test_kind: TestKind, 374 - krate: Option<&str>) { 375 - let (name, path, features, root) = match mode { 376 - Mode::Libstd => { 377 - ("libstd", "src/libstd", build.std_features(), "std") 378 - } 379 - Mode::Libtest => { 380 - ("libtest", "src/libtest", String::new(), "test") 381 - } 382 - Mode::Librustc => { 383 - ("librustc", "src/rustc", build.rustc_features(), "rustc-main") 384 - } 385 - _ => panic!("can only test libraries"), 386 - }; 387 - println!("{} {} stage{} ({} -> {})", test_kind, name, compiler.stage, 388 - compiler.host, target); 389 - 390 - // If we're not doing a full bootstrap but we're testing a stage2 version of 391 - // libstd, then what we're actually testing is the libstd produced in 392 - // stage1. Reflect that here by updating the compiler that we're working 393 - // with automatically. 394 - let compiler = if build.force_use_stage1(compiler, target) { 395 - Compiler::new(1, compiler.host) 396 - } else { 397 - compiler.clone() 398 - }; 399 - 400 - // Build up the base `cargo test` command. 401 - // 402 - // Pass in some standard flags then iterate over the graph we've discovered 403 - // in `cargo metadata` with the maps above and figure out what `-p` 404 - // arguments need to get passed. 405 - let mut cargo = build.cargo(&compiler, mode, target, test_kind.subcommand()); 406 - cargo.arg("--manifest-path") 407 - .arg(build.src.join(path).join("Cargo.toml")) 408 - .arg("--features").arg(features); 409 - 410 - match krate { 411 - Some(krate) => { 412 - cargo.arg("-p").arg(krate); 413 - } 414 - None => { 415 - let mut visited = HashSet::new(); 416 - let mut next = vec![root]; 417 - while let Some(name) = next.pop() { 418 - // Right now jemalloc is our only target-specific crate in the 419 - // sense that it's not present on all platforms. 
Custom skip it 420 - // here for now, but if we add more this probably wants to get 421 - // more generalized. 422 - // 423 - // Also skip `build_helper` as it's not compiled normally for 424 - // target during the bootstrap and it's just meant to be a 425 - // helper crate, not tested. If it leaks through then it ends up 426 - // messing with various mtime calculations and such. 427 - if !name.contains("jemalloc") && name != "build_helper" { 428 - cargo.arg("-p").arg(&format!("{}:0.0.0", name)); 429 - } 430 - for dep in build.crates[name].deps.iter() { 431 - if visited.insert(dep) { 432 - next.push(dep); 433 - } 434 - } 435 - } 436 - } 437 - } 438 - 439 - // The tests are going to run with the *target* libraries, so we need to 440 - // ensure that those libraries show up in the LD_LIBRARY_PATH equivalent. 441 - // 442 - // Note that to run the compiler we need to run with the *host* libraries, 443 - // but our wrapper scripts arrange for that to be the case anyway. 444 - let mut dylib_path = dylib_path(); 445 - dylib_path.insert(0, build.sysroot_libdir(&compiler, target)); 446 - cargo.env(dylib_path_var(), env::join_paths(&dylib_path).unwrap()); 447 - 448 - if target.contains("emscripten") || build.remote_tested(target) { 449 - cargo.arg("--no-run"); 450 - } 451 - 452 - cargo.arg("--"); 453 - 454 - if build.config.quiet_tests { 455 - cargo.arg("--quiet"); 456 - } 457 - 458 - let _time = util::timeit(); 459 - 460 - if target.contains("emscripten") { 461 - build.run(&mut cargo); 462 - krate_emscripten(build, &compiler, target, mode); 463 - } else if build.remote_tested(target) { 464 - build.run(&mut cargo); 465 - krate_remote(build, &compiler, target, mode); 466 - } else { 467 - cargo.args(&build.flags.cmd.test_args()); 468 - build.run(&mut cargo); 469 - } 470 -} 471 - 472 -fn krate_emscripten(build: &Build, 473 - compiler: &Compiler, 474 - target: &str, 475 - mode: Mode) { 476 - let mut tests = Vec::new(); 477 - let out_dir = build.cargo_out(compiler, mode, 
target); 478 - find_tests(&out_dir.join("deps"), target, &mut tests); 479 - 480 - for test in tests { 481 - let test_file_name = test.to_string_lossy().into_owned(); 482 - println!("running {}", test_file_name); 483 - let nodejs = build.config.nodejs.as_ref().expect("nodejs not configured"); 484 - let mut cmd = Command::new(nodejs); 485 - cmd.arg(&test_file_name); 486 - if build.config.quiet_tests { 487 - cmd.arg("--quiet"); 488 - } 489 - build.run(&mut cmd); 490 - } 491 -} 492 - 493 -fn krate_remote(build: &Build, 494 - compiler: &Compiler, 495 - target: &str, 496 - mode: Mode) { 497 - let mut tests = Vec::new(); 498 - let out_dir = build.cargo_out(compiler, mode, target); 499 - find_tests(&out_dir.join("deps"), target, &mut tests); 500 - 501 - let tool = build.tool(&Compiler::new(0, &build.config.build), 502 - "remote-test-client"); 503 - for test in tests { 504 - let mut cmd = Command::new(&tool); 505 - cmd.arg("run") 506 - .arg(&test); 507 - if build.config.quiet_tests { 508 - cmd.arg("--quiet"); 509 - } 510 - cmd.args(&build.flags.cmd.test_args()); 511 - build.run(&mut cmd); 512 - } 513 -} 514 - 515 -fn find_tests(dir: &Path, 516 - target: &str, 517 - dst: &mut Vec<PathBuf>) { 518 - for e in t!(dir.read_dir()).map(|e| t!(e)) { 519 - let file_type = t!(e.file_type()); 520 - if !file_type.is_file() { 521 - continue 522 - } 523 - let filename = e.file_name().into_string().unwrap(); 524 - if (target.contains("windows") && filename.ends_with(".exe")) || 525 - (!target.contains("windows") && !filename.contains(".")) || 526 - (target.contains("emscripten") && filename.ends_with(".js")) { 527 - dst.push(e.path()); 528 - } 529 - } 530 -} 531 - 532 -pub fn remote_copy_libs(build: &Build, compiler: &Compiler, target: &str) { 533 - if !build.remote_tested(target) { 534 - return 535 - } 536 - 537 - println!("REMOTE copy libs to emulator ({})", target); 538 - t!(fs::create_dir_all(build.out.join("tmp"))); 539 - 540 - let server = build.cargo_out(compiler, Mode::Tool, 
target) 541 - .join(exe("remote-test-server", target)); 542 - 543 - // Spawn the emulator and wait for it to come online 544 - let tool = build.tool(&Compiler::new(0, &build.config.build), 545 - "remote-test-client"); 546 - let mut cmd = Command::new(&tool); 547 - cmd.arg("spawn-emulator") 548 - .arg(target) 549 - .arg(&server) 550 - .arg(build.out.join("tmp")); 551 - if let Some(rootfs) = build.qemu_rootfs(target) { 552 - cmd.arg(rootfs); 553 - } 554 - build.run(&mut cmd); 555 - 556 - // Push all our dylibs to the emulator 557 - for f in t!(build.sysroot_libdir(compiler, target).read_dir()) { 558 - let f = t!(f); 559 - let name = f.file_name().into_string().unwrap(); 560 - if util::is_dylib(&name) { 561 - build.run(Command::new(&tool) 562 - .arg("push") 563 - .arg(f.path())); 564 - } 565 - } 566 -} 567 - 568 -/// Run "distcheck", a 'make check' from a tarball 569 -pub fn distcheck(build: &Build) { 570 - if build.config.build != "x86_64-unknown-linux-gnu" { 571 - return 572 - } 573 - if !build.config.host.iter().any(|s| s == "x86_64-unknown-linux-gnu") { 574 - return 575 - } 576 - if !build.config.target.iter().any(|s| s == "x86_64-unknown-linux-gnu") { 577 - return 578 - } 579 - 580 - println!("Distcheck"); 581 - let dir = build.out.join("tmp").join("distcheck"); 582 - let _ = fs::remove_dir_all(&dir); 583 - t!(fs::create_dir_all(&dir)); 584 - 585 - let mut cmd = Command::new("tar"); 586 - cmd.arg("-xzf") 587 - .arg(dist::rust_src_location(build)) 588 - .arg("--strip-components=1") 589 - .current_dir(&dir); 590 - build.run(&mut cmd); 591 - build.run(Command::new("./configure") 592 - .args(&build.config.configure_args) 593 - .arg("--enable-vendor") 594 - .current_dir(&dir)); 595 - build.run(Command::new(build_helper::make(&build.config.build)) 596 - .arg("check") 597 - .current_dir(&dir)); 598 - 599 - // Now make sure that rust-src has all of libstd's dependencies 600 - println!("Distcheck rust-src"); 601 - let dir = build.out.join("tmp").join("distcheck-src"); 602 
- let _ = fs::remove_dir_all(&dir); 603 - t!(fs::create_dir_all(&dir)); 604 - 605 - let mut cmd = Command::new("tar"); 606 - cmd.arg("-xzf") 607 - .arg(dist::rust_src_installer(build)) 608 - .arg("--strip-components=1") 609 - .current_dir(&dir); 610 - build.run(&mut cmd); 611 - 612 - let toml = dir.join("rust-src/lib/rustlib/src/rust/src/libstd/Cargo.toml"); 613 - build.run(Command::new(&build.cargo) 614 - .arg("generate-lockfile") 615 - .arg("--manifest-path") 616 - .arg(&toml) 617 - .current_dir(&dir)); 618 -} 619 - 620 -/// Test the build system itself 621 -pub fn bootstrap(build: &Build) { 622 - let mut cmd = Command::new(&build.cargo); 623 - cmd.arg("test") 624 - .current_dir(build.src.join("src/bootstrap")) 625 - .env("CARGO_TARGET_DIR", build.out.join("bootstrap")) 626 - .env("RUSTC", &build.rustc); 627 - cmd.arg("--").args(&build.flags.cmd.test_args()); 628 - build.run(&mut cmd); 629 -}
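The filename filter inside `find_tests` in the deleted `check.rs` above decides which files in the cargo `deps` output directory count as test executables for a given target triple. Factored out as a small predicate for illustration (`is_test_artifact` is a hypothetical name, not in the original):

```rust
// Sketch of the find_tests filename filter above: Windows test binaries
// end in .exe, emscripten test "binaries" end in .js, and on other Unix
// targets the executables are the extensionless files in deps/.
fn is_test_artifact(target: &str, filename: &str) -> bool {
    (target.contains("windows") && filename.ends_with(".exe")) ||
    (!target.contains("windows") && !filename.contains('.')) ||
    (target.contains("emscripten") && filename.ends_with(".js"))
}
```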
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/clean.rs version [8c3e7c98be].
1 -// Copyright 2016 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Implementation of `make clean` in rustbuild. 12 -//! 13 -//! Responsible for cleaning out a build directory of all old and stale 14 -//! artifacts to prepare for a fresh build. Currently doesn't remove the 15 -//! `build/cache` directory (download cache) or the `build/$target/llvm` 16 -//! directory as we want that cached between builds. 17 - 18 -use std::fs; 19 -use std::io::{self, ErrorKind}; 20 -use std::path::Path; 21 - 22 -use Build; 23 - 24 -pub fn clean(build: &Build) { 25 - rm_rf("tmp".as_ref()); 26 - rm_rf(&build.out.join("tmp")); 27 - rm_rf(&build.out.join("dist")); 28 - 29 - for host in build.config.host.iter() { 30 - let entries = match build.out.join(host).read_dir() { 31 - Ok(iter) => iter, 32 - Err(_) => continue, 33 - }; 34 - 35 - for entry in entries { 36 - let entry = t!(entry); 37 - if entry.file_name().to_str() == Some("llvm") { 38 - continue 39 - } 40 - let path = t!(entry.path().canonicalize()); 41 - rm_rf(&path); 42 - } 43 - } 44 -} 45 - 46 -fn rm_rf(path: &Path) { 47 - match path.symlink_metadata() { 48 - Err(e) => { 49 - if e.kind() == ErrorKind::NotFound { 50 - return; 51 - } 52 - panic!("failed to get metadata for file {}: {}", path.display(), e); 53 - }, 54 - Ok(metadata) => { 55 - if metadata.file_type().is_file() || metadata.file_type().is_symlink() { 56 - do_op(path, "remove file", |p| fs::remove_file(p)); 57 - return; 58 - } 59 - 60 - for file in t!(fs::read_dir(path)) { 61 - rm_rf(&t!(file).path()); 62 - } 63 - do_op(path, "remove dir", 
|p| fs::remove_dir(p)); 64 - }, 65 - }; 66 -} 67 - 68 -fn do_op<F>(path: &Path, desc: &str, mut f: F) 69 - where F: FnMut(&Path) -> io::Result<()> 70 -{ 71 - match f(path) { 72 - Ok(()) => {} 73 - // On windows we can't remove a readonly file, and git will often clone files as readonly. 74 - // As a result, we have some special logic to remove readonly files on windows. 75 - // This is also the reason that we can't use things like fs::remove_dir_all(). 76 - Err(ref e) if cfg!(windows) && 77 - e.kind() == ErrorKind::PermissionDenied => { 78 - let mut p = t!(path.symlink_metadata()).permissions(); 79 - p.set_readonly(false); 80 - t!(fs::set_permissions(path, p)); 81 - f(path).unwrap_or_else(|e| { 82 - panic!("failed to {} {}: {}", desc, path.display(), e); 83 - }) 84 - } 85 - Err(e) => { 86 - panic!("failed to {} {}: {}", desc, path.display(), e); 87 - } 88 - } 89 -}
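The `do_op` workaround in the deleted `clean.rs` above exists because Windows cannot remove readonly files, which git frequently creates on clone. The retry logic can be sketched as a single self-contained remove function; `remove_with_readonly_retry` is a hypothetical helper for illustration, not part of rustbuild:

```rust
use std::fs;
use std::io::{self, ErrorKind};
use std::path::Path;

// Sketch of the readonly workaround in do_op above: on PermissionDenied
// (a readonly file on Windows), clear the readonly bit via the file's
// permissions and retry the removal once.
fn remove_with_readonly_retry(path: &Path) -> io::Result<()> {
    match fs::remove_file(path) {
        Err(ref e) if cfg!(windows) && e.kind() == ErrorKind::PermissionDenied => {
            let mut perms = fs::symlink_metadata(path)?.permissions();
            perms.set_readonly(false);
            fs::set_permissions(path, perms)?;
            fs::remove_file(path)
        }
        other => other,
    }
}
```

This per-file handling is also why the original cannot simply call `fs::remove_dir_all()`, as its own comment notes.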
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/compile.rs version [15e722bbd5].
1 -// Copyright 2015 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Implementation of compiling various phases of the compiler and standard 12 -//! library. 13 -//! 14 -//! This module contains some of the real meat in the rustbuild build system 15 -//! which is where Cargo is used to compiler the standard library, libtest, and 16 -//! compiler. This module is also responsible for assembling the sysroot as it 17 -//! goes along from the output of the previous stage. 18 - 19 -use std::collections::HashMap; 20 -use std::fs::{self, File}; 21 -use std::path::{Path, PathBuf}; 22 -use std::process::Command; 23 -use std::env; 24 - 25 -use build_helper::{output, mtime, up_to_date}; 26 -use filetime::FileTime; 27 - 28 -use channel::GitInfo; 29 -use util::{exe, libdir, is_dylib, copy}; 30 -use {Build, Compiler, Mode}; 31 - 32 -/// Build the standard library. 33 -/// 34 -/// This will build the standard library for a particular stage of the build 35 -/// using the `compiler` targeting the `target` architecture. The artifacts 36 -/// created will also be linked into the sysroot directory. 
37 -pub fn std(build: &Build, target: &str, compiler: &Compiler) { 38 - let libdir = build.sysroot_libdir(compiler, target); 39 - t!(fs::create_dir_all(&libdir)); 40 - 41 - println!("Building stage{} std artifacts ({} -> {})", compiler.stage, 42 - compiler.host, target); 43 - 44 - let out_dir = build.cargo_out(compiler, Mode::Libstd, target); 45 - build.clear_if_dirty(&out_dir, &build.compiler_path(compiler)); 46 - let mut cargo = build.cargo(compiler, Mode::Libstd, target, "build"); 47 - let mut features = build.std_features(); 48 - 49 - if let Ok(target) = env::var("MACOSX_STD_DEPLOYMENT_TARGET") { 50 - cargo.env("MACOSX_DEPLOYMENT_TARGET", target); 51 - } 52 - 53 - // When doing a local rebuild we tell cargo that we're stage1 rather than 54 - // stage0. This works fine if the local rust and being-built rust have the 55 - // same view of what the default allocator is, but fails otherwise. Since 56 - // we don't have a way to express an allocator preference yet, work 57 - // around the issue in the case of a local rebuild with jemalloc disabled. 58 - if compiler.stage == 0 && build.local_rebuild && !build.config.use_jemalloc { 59 - features.push_str(" force_alloc_system"); 60 - } 61 - 62 - if compiler.stage != 0 && build.config.sanitizers { 63 - // This variable is used by the sanitizer runtime crates, e.g. 
64 - // rustc_lsan, to build the sanitizer runtime from C code 65 - // When this variable is missing, those crates won't compile the C code, 66 - // so we don't set this variable during stage0 where llvm-config is 67 - // missing 68 - // We also only build the runtimes when --enable-sanitizers (or its 69 - // config.toml equivalent) is used 70 - cargo.env("LLVM_CONFIG", build.llvm_config(target)); 71 - } 72 - cargo.arg("--features").arg(features) 73 - .arg("--manifest-path") 74 - .arg(build.src.join("src/libstd/Cargo.toml")); 75 - 76 - if let Some(target) = build.config.target_config.get(target) { 77 - if let Some(ref jemalloc) = target.jemalloc { 78 - cargo.env("JEMALLOC_OVERRIDE", jemalloc); 79 - } 80 - } 81 - if target.contains("musl") { 82 - if let Some(p) = build.musl_root(target) { 83 - cargo.env("MUSL_ROOT", p); 84 - } 85 - } 86 - 87 - build.run(&mut cargo); 88 - update_mtime(build, &libstd_stamp(build, &compiler, target)); 89 -} 90 - 91 -/// Link all libstd rlibs/dylibs into the sysroot location. 92 -/// 93 -/// Links those artifacts generated by `compiler` to a the `stage` compiler's 94 -/// sysroot for the specified `host` and `target`. 95 -/// 96 -/// Note that this assumes that `compiler` has already generated the libstd 97 -/// libraries for `target`, and this method will find them in the relevant 98 -/// output directory. 
pub fn std_link(build: &Build,
                compiler: &Compiler,
                target_compiler: &Compiler,
                target: &str) {
    println!("Copying stage{} std from stage{} ({} -> {} / {})",
             target_compiler.stage,
             compiler.stage,
             compiler.host,
             target_compiler.host,
             target);
    let libdir = build.sysroot_libdir(&target_compiler, target);
    let out_dir = build.cargo_out(&compiler, Mode::Libstd, target);

    t!(fs::create_dir_all(&libdir));
    add_to_sysroot(&out_dir, &libdir);

    if target.contains("musl") && !target.contains("mips") {
        copy_musl_third_party_objects(build, target, &libdir);
    }

    if build.config.sanitizers && compiler.stage != 0 && target == "x86_64-apple-darwin" {
        // The sanitizers are only built in stage1 or above, so the dylibs will
        // be missing in stage0 and cause a panic. See the `std()` function above
        // for the reason why the sanitizers are not built in stage0.
        copy_apple_sanitizer_dylibs(&build.native_dir(target), "osx", &libdir);
    }
}

/// Copies the crt(1,i,n).o startup objects.
///
/// Only required for musl targets that statically link to libc.
fn copy_musl_third_party_objects(build: &Build, target: &str, into: &Path) {
    for &obj in &["crt1.o", "crti.o", "crtn.o"] {
        copy(&build.musl_root(target).unwrap().join("lib").join(obj), &into.join(obj));
    }
}

fn copy_apple_sanitizer_dylibs(native_dir: &Path, platform: &str, into: &Path) {
    for &sanitizer in &["asan", "tsan"] {
        let filename = format!("libclang_rt.{}_{}_dynamic.dylib", sanitizer, platform);
        let mut src_path = native_dir.join(sanitizer);
        src_path.push("build");
        src_path.push("lib");
        src_path.push("darwin");
        src_path.push(&filename);
        copy(&src_path, &into.join(filename));
    }
}

/// Build and prepare startup objects like rsbegin.o and rsend.o.
///
/// These are primarily used on Windows right now for linking executables/dlls.
/// They don't require any library support as they're just plain old object
/// files, so we just use the nightly snapshot compiler to always build them (as
/// no other compilers are guaranteed to be available).
pub fn build_startup_objects(build: &Build, for_compiler: &Compiler, target: &str) {
    if !target.contains("pc-windows-gnu") {
        return
    }

    let compiler = Compiler::new(0, &build.config.build);
    let compiler_path = build.compiler_path(&compiler);
    let src_dir = &build.src.join("src/rtstartup");
    let dst_dir = &build.native_dir(target).join("rtstartup");
    let sysroot_dir = &build.sysroot_libdir(for_compiler, target);
    t!(fs::create_dir_all(dst_dir));
    t!(fs::create_dir_all(sysroot_dir));

    for file in &["rsbegin", "rsend"] {
        let src_file = &src_dir.join(file.to_string() + ".rs");
        let dst_file = &dst_dir.join(file.to_string() + ".o");
        if !up_to_date(src_file, dst_file) {
            let mut cmd = Command::new(&compiler_path);
            build.run(cmd.env("RUSTC_BOOTSTRAP", "1")
                         .arg("--cfg").arg(format!("stage{}", compiler.stage))
                         .arg("--target").arg(target)
                         .arg("--emit=obj")
                         .arg("--out-dir").arg(dst_dir)
                         .arg(src_file));
        }

        copy(dst_file, &sysroot_dir.join(file.to_string() + ".o"));
    }

    for obj in ["crt2.o", "dllcrt2.o"].iter() {
        copy(&compiler_file(build.cc(target), obj), &sysroot_dir.join(obj));
    }
}

/// Build libtest.
///
/// This will build libtest and supporting libraries for a particular stage of
/// the build using the `compiler` targeting the `target` architecture. The
/// artifacts created will also be linked into the sysroot directory.
pub fn test(build: &Build, target: &str, compiler: &Compiler) {
    println!("Building stage{} test artifacts ({} -> {})", compiler.stage,
             compiler.host, target);
    let out_dir = build.cargo_out(compiler, Mode::Libtest, target);
    build.clear_if_dirty(&out_dir, &libstd_stamp(build, compiler, target));
    let mut cargo = build.cargo(compiler, Mode::Libtest, target, "build");
    if let Ok(target) = env::var("MACOSX_STD_DEPLOYMENT_TARGET") {
        cargo.env("MACOSX_DEPLOYMENT_TARGET", target);
    }
    cargo.arg("--manifest-path")
         .arg(build.src.join("src/libtest/Cargo.toml"));
    build.run(&mut cargo);
    update_mtime(build, &libtest_stamp(build, compiler, target));
}

/// Same as `std_link`, only for libtest.
pub fn test_link(build: &Build,
                 compiler: &Compiler,
                 target_compiler: &Compiler,
                 target: &str) {
    println!("Copying stage{} test from stage{} ({} -> {} / {})",
             target_compiler.stage,
             compiler.stage,
             compiler.host,
             target_compiler.host,
             target);
    let libdir = build.sysroot_libdir(&target_compiler, target);
    let out_dir = build.cargo_out(&compiler, Mode::Libtest, target);
    add_to_sysroot(&out_dir, &libdir);
}

/// Build the compiler.
///
/// This will build the compiler for a particular stage of the build using
/// the `compiler` targeting the `target` architecture. The artifacts
/// created will also be linked into the sysroot directory.
pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
    println!("Building stage{} compiler artifacts ({} -> {})",
             compiler.stage, compiler.host, target);

    let out_dir = build.cargo_out(compiler, Mode::Librustc, target);
    build.clear_if_dirty(&out_dir, &libtest_stamp(build, compiler, target));

    let mut cargo = build.cargo(compiler, Mode::Librustc, target, "build");
    cargo.arg("--features").arg(build.rustc_features())
         .arg("--manifest-path")
         .arg(build.src.join("src/rustc/Cargo.toml"));

    // Set some configuration variables picked up by build scripts and
    // the compiler alike.
    cargo.env("CFG_RELEASE", build.rust_release())
         .env("CFG_RELEASE_CHANNEL", &build.config.channel)
         .env("CFG_VERSION", build.rust_version())
         .env("CFG_PREFIX", build.config.prefix.clone().unwrap_or(PathBuf::new()));

    if compiler.stage == 0 {
        cargo.env("CFG_LIBDIR_RELATIVE", "lib");
    } else {
        let libdir_relative = build.config.libdir_relative.clone().unwrap_or(PathBuf::from("lib"));
        cargo.env("CFG_LIBDIR_RELATIVE", libdir_relative);
    }

    // If we're not building a compiler with debugging information then remove
    // these two env vars which would be set otherwise.
    if build.config.rust_debuginfo_only_std {
        cargo.env_remove("RUSTC_DEBUGINFO");
        cargo.env_remove("RUSTC_DEBUGINFO_LINES");
    }

    if let Some(ref ver_date) = build.rust_info.commit_date() {
        cargo.env("CFG_VER_DATE", ver_date);
    }
    if let Some(ref ver_hash) = build.rust_info.sha() {
        cargo.env("CFG_VER_HASH", ver_hash);
    }
    if !build.unstable_features() {
        cargo.env("CFG_DISABLE_UNSTABLE_FEATURES", "1");
    }
    // Flag that rust llvm is in use.
    if build.is_rust_llvm(target) {
        cargo.env("LLVM_RUSTLLVM", "1");
    }
    cargo.env("LLVM_CONFIG", build.llvm_config(target));
    let target_config = build.config.target_config.get(target);
    if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
        cargo.env("CFG_LLVM_ROOT", s);
    }
    // Building with a static libstdc++ is only supported on linux right now,
    // not for MSVC or macOS.
    if build.config.llvm_static_stdcpp &&
       !target.contains("windows") &&
       !target.contains("apple") {
        cargo.env("LLVM_STATIC_STDCPP",
                  compiler_file(build.cxx(target), "libstdc++.a"));
    }
    if build.config.llvm_link_shared {
        cargo.env("LLVM_LINK_SHARED", "1");
    }
    if let Some(ref s) = build.config.rustc_default_linker {
        cargo.env("CFG_DEFAULT_LINKER", s);
    }
    if let Some(ref s) = build.config.rustc_default_ar {
        cargo.env("CFG_DEFAULT_AR", s);
    }
    build.run(&mut cargo);
    update_mtime(build, &librustc_stamp(build, compiler, target));
}

/// Same as `std_link`, only for librustc.
pub fn rustc_link(build: &Build,
                  compiler: &Compiler,
                  target_compiler: &Compiler,
                  target: &str) {
    println!("Copying stage{} rustc from stage{} ({} -> {} / {})",
             target_compiler.stage,
             compiler.stage,
             compiler.host,
             target_compiler.host,
             target);
    let libdir = build.sysroot_libdir(&target_compiler, target);
    let out_dir = build.cargo_out(&compiler, Mode::Librustc, target);
    add_to_sysroot(&out_dir, &libdir);
}

/// Cargo's output path for the standard library in a given stage, compiled
/// by a particular compiler for the specified target.
fn libstd_stamp(build: &Build, compiler: &Compiler, target: &str) -> PathBuf {
    build.cargo_out(compiler, Mode::Libstd, target).join(".libstd.stamp")
}

/// Cargo's output path for libtest in a given stage, compiled by a particular
/// compiler for the specified target.
fn libtest_stamp(build: &Build, compiler: &Compiler, target: &str) -> PathBuf {
    build.cargo_out(compiler, Mode::Libtest, target).join(".libtest.stamp")
}

/// Cargo's output path for librustc in a given stage, compiled by a particular
/// compiler for the specified target.
fn librustc_stamp(build: &Build, compiler: &Compiler, target: &str) -> PathBuf {
    build.cargo_out(compiler, Mode::Librustc, target).join(".librustc.stamp")
}

fn compiler_file(compiler: &Path, file: &str) -> PathBuf {
    let out = output(Command::new(compiler)
                     .arg(format!("-print-file-name={}", file)));
    PathBuf::from(out.trim())
}

pub fn create_sysroot(build: &Build, compiler: &Compiler) {
    let sysroot = build.sysroot(compiler);
    let _ = fs::remove_dir_all(&sysroot);
    t!(fs::create_dir_all(&sysroot));
}

/// Prepare a new compiler from the artifacts in `stage`.
///
/// This will assemble a compiler in `build/$host/stage$stage`. The compiler
/// must have been previously produced by the `stage - 1` build.config.build
/// compiler.
pub fn assemble_rustc(build: &Build, stage: u32, host: &str) {
    // Nothing to do in stage0.
    if stage == 0 {
        return
    }

    println!("Copying stage{} compiler ({})", stage, host);

    // The compiler that we're assembling.
    let target_compiler = Compiler::new(stage, host);

    // The compiler that compiled the compiler we're assembling.
    let build_compiler = Compiler::new(stage - 1, &build.config.build);

    // Link in all dylibs to the libdir.
    let sysroot = build.sysroot(&target_compiler);
    let sysroot_libdir = sysroot.join(libdir(host));
    t!(fs::create_dir_all(&sysroot_libdir));
    let src_libdir = build.sysroot_libdir(&build_compiler, host);
    for f in t!(fs::read_dir(&src_libdir)).map(|f| t!(f)) {
        let filename = f.file_name().into_string().unwrap();
        if is_dylib(&filename) {
            copy(&f.path(), &sysroot_libdir.join(&filename));
        }
    }

    let out_dir = build.cargo_out(&build_compiler, Mode::Librustc, host);

    // Link the compiler binary itself into place.
    let rustc = out_dir.join(exe("rustc", host));
    let bindir = sysroot.join("bin");
    t!(fs::create_dir_all(&bindir));
    let compiler = build.compiler_path(&Compiler::new(stage, host));
    let _ = fs::remove_file(&compiler);
    copy(&rustc, &compiler);

    // See if rustdoc exists to link it into place.
    let rustdoc = exe("rustdoc", host);
    let rustdoc_src = out_dir.join(&rustdoc);
    let rustdoc_dst = bindir.join(&rustdoc);
    if fs::metadata(&rustdoc_src).is_ok() {
        let _ = fs::remove_file(&rustdoc_dst);
        copy(&rustdoc_src, &rustdoc_dst);
    }
}

/// Link some files into a rustc sysroot.
///
/// For a particular stage this will link all of the contents of `out_dir`
/// into the sysroot of the `host` compiler, assuming the artifacts are
/// compiled for the specified `target`.
fn add_to_sysroot(out_dir: &Path, sysroot_dst: &Path) {
    // Collect the set of all files in the dependencies directory, keyed
    // off the name of the library. We assume everything is of the form
    // `foo-<hash>.{rlib,so,...}`, and there could be multiple different
    // `<hash>` values for the same name (of old builds).
    let mut map = HashMap::new();
    for file in t!(fs::read_dir(out_dir.join("deps"))).map(|f| t!(f)) {
        let filename = file.file_name().into_string().unwrap();

        // We're only interested in linking rlibs + dylibs; other things like
        // unit tests don't get linked in.
        if !filename.ends_with(".rlib") &&
           !filename.ends_with(".lib") &&
           !is_dylib(&filename) {
            continue
        }
        let file = file.path();
        let dash = filename.find("-").unwrap();
        let key = (filename[..dash].to_string(),
                   file.extension().unwrap().to_owned());
        map.entry(key).or_insert(Vec::new())
           .push(file.clone());
    }

    // For all hash values found, pick the most recent one to move into the
    // sysroot; that should be the one we just built.
    for (_, paths) in map {
        let (_, path) = paths.iter().map(|path| {
            (mtime(&path).seconds(), path)
        }).max().unwrap();
        copy(&path, &sysroot_dst.join(path.file_name().unwrap()));
    }
}

/// Clean the tool output directory if necessary.
///
/// Tools are built against the library artifacts of `mode`; if the
/// corresponding stamp file is newer than the existing tool output, the
/// stale output is cleared so the tools get rebuilt.
pub fn maybe_clean_tools(build: &Build, stage: u32, target: &str, mode: Mode) {
    let compiler = Compiler::new(stage, &build.config.build);

    let stamp = match mode {
        Mode::Libstd => libstd_stamp(build, &compiler, target),
        Mode::Libtest => libtest_stamp(build, &compiler, target),
        Mode::Librustc => librustc_stamp(build, &compiler, target),
        _ => panic!(),
    };
    let out_dir = build.cargo_out(&compiler, Mode::Tool, target);
    build.clear_if_dirty(&out_dir, &stamp);
}

/// Build a tool in `src/tools`.
///
/// This will build the specified tool with the specified `host` compiler in
/// `stage` into the normal cargo output directory.
pub fn tool(build: &Build, stage: u32, target: &str, tool: &str) {
    println!("Building stage{} tool {} ({})", stage, tool, target);

    let compiler = Compiler::new(stage, &build.config.build);

    let mut cargo = build.cargo(&compiler, Mode::Tool, target, "build");
    let dir = build.src.join("src/tools").join(tool);
    cargo.arg("--manifest-path").arg(dir.join("Cargo.toml"));

    // We don't want to build tools dynamically as they'll be running across
    // stages and such, and it's just easier if they're not dynamically linked.
    cargo.env("RUSTC_NO_PREFER_DYNAMIC", "1");

    if let Some(dir) = build.openssl_install_dir(target) {
        cargo.env("OPENSSL_STATIC", "1");
        cargo.env("OPENSSL_DIR", dir);
        cargo.env("LIBZ_SYS_STATIC", "1");
    }

    cargo.env("CFG_RELEASE_CHANNEL", &build.config.channel);

    let info = GitInfo::new(&dir);
    if let Some(sha) = info.sha() {
        cargo.env("CFG_COMMIT_HASH", sha);
    }
    if let Some(sha_short) = info.sha_short() {
        cargo.env("CFG_SHORT_COMMIT_HASH", sha_short);
    }
    if let Some(date) = info.commit_date() {
        cargo.env("CFG_COMMIT_DATE", date);
    }

    build.run(&mut cargo);
}

/// Updates the mtime of a stamp file if necessary, only changing it if it's
/// older than some other library file in the same directory.
///
/// We don't know what file Cargo is going to output (because there's a hash in
/// the file name) but we know where it's going to put it. We use this helper to
/// detect changes to that output file by looking at the modification time for
/// all files in a directory and updating the stamp if any are newer.
///
/// Note that we only consider Rust libraries as that's what we're interested in
/// propagating changes from. Files like executables are tracked elsewhere.
fn update_mtime(build: &Build, path: &Path) {
    let entries = match path.parent().unwrap().join("deps").read_dir() {
        Ok(entries) => entries,
        Err(_) => return,
    };
    let files = entries.map(|e| t!(e)).filter(|e| t!(e.file_type()).is_file());
    let files = files.filter(|e| {
        let filename = e.file_name();
        let filename = filename.to_str().unwrap();
        filename.ends_with(".rlib") ||
        filename.ends_with(".lib") ||
        is_dylib(&filename)
    });
    let max = files.max_by_key(|entry| {
        let meta = t!(entry.metadata());
        FileTime::from_last_modification_time(&meta)
    });
    let max = match max {
        Some(max) => max,
        None => return,
    };

    if mtime(&max.path()) > mtime(path) {
        build.verbose(&format!("updating {:?} as {:?} changed", path, max.path()));
        t!(File::create(path));
    }
}
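The `add_to_sysroot` function above groups Cargo artifacts named `foo-<hash>.<ext>` by crate name and keeps only the newest hash per name, so stale artifacts from old builds are not copied into the sysroot. The sketch below shows just that selection rule in isolation; `newest_per_lib` is an illustrative helper (not part of the bootstrap), and plain integers stand in for filesystem mtimes so the example runs without touching disk.

```rust
use std::collections::HashMap;

// Given (filename, mtime) pairs where filenames look like `foo-<hash>.<ext>`,
// keep only the newest file for each crate name `foo`.
fn newest_per_lib(files: &[(&str, u64)]) -> HashMap<String, String> {
    // Bucket files by the crate-name prefix before the first dash.
    let mut map: HashMap<String, Vec<(&str, u64)>> = HashMap::new();
    for &(name, mtime) in files {
        let dash = name.find('-').unwrap();
        map.entry(name[..dash].to_string())
           .or_insert_with(Vec::new)
           .push((name, mtime));
    }
    // For each bucket, pick the entry with the largest mtime.
    map.into_iter()
       .map(|(lib, mut v)| {
           v.sort_by_key(|&(_, m)| m);
           (lib, v.last().unwrap().0.to_string())
       })
       .collect()
}

fn main() {
    let picked = newest_per_lib(&[
        ("std-aaaa.rlib", 100),  // stale build, older mtime
        ("std-bbbb.rlib", 200),  // most recent, this one is kept
        ("core-cccc.rlib", 50),
    ]);
    assert_eq!(picked["std"], "std-bbbb.rlib");
    assert_eq!(picked["core"], "core-cccc.rlib");
    println!("ok");
}
```

The real function additionally keys by file extension, so an rlib and a dylib of the same crate are deduplicated independently.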
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/config.rs version [b044672c89].
// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! Serialized configuration of a build.
//!
//! This module implements parsing `config.mk` and `config.toml` configuration
//! files to tweak how the build runs.

use std::collections::HashMap;
use std::env;
use std::fs::File;
use std::io::prelude::*;
use std::path::PathBuf;
use std::process;

use num_cpus;
use rustc_serialize::Decodable;
use toml::{Parser, Decoder, Value};
use util::{exe, push_exe_path};

/// Global configuration for the entire build and/or bootstrap.
///
/// This structure is derived from a combination of both `config.toml` and
/// `config.mk`. As of the time of this writing it's unlikely that `config.toml`
/// is used all that much, so this is primarily filled out by `config.mk` which
/// is generated from `./configure`.
///
/// Note that this structure is not decoded directly into, but rather it is
/// filled out from the decoded forms of the structs below. For documentation
/// of each field, see the corresponding fields in
/// `src/bootstrap/config.toml.example`.
#[derive(Default)]
pub struct Config {
    pub ccache: Option<String>,
    pub ninja: bool,
    pub verbose: usize,
    pub submodules: bool,
    pub compiler_docs: bool,
    pub docs: bool,
    pub locked_deps: bool,
    pub vendor: bool,
    pub target_config: HashMap<String, Target>,
    pub full_bootstrap: bool,
    pub extended: bool,
    pub sanitizers: bool,

    // llvm codegen options
    pub llvm_assertions: bool,
    pub llvm_optimize: bool,
    pub llvm_release_debuginfo: bool,
    pub llvm_version_check: bool,
    pub llvm_static_stdcpp: bool,
    pub llvm_link_shared: bool,
    pub llvm_targets: Option<String>,
    pub llvm_link_jobs: Option<u32>,
    pub llvm_clean_rebuild: bool,

    // rust codegen options
    pub rust_optimize: bool,
    pub rust_codegen_units: u32,
    pub rust_debug_assertions: bool,
    pub rust_debuginfo: bool,
    pub rust_debuginfo_lines: bool,
    pub rust_debuginfo_only_std: bool,
    pub rust_rpath: bool,
    pub rustc_default_linker: Option<String>,
    pub rustc_default_ar: Option<String>,
    pub rust_optimize_tests: bool,
    pub rust_debuginfo_tests: bool,
    pub rust_dist_src: bool,

    pub build: String,
    pub host: Vec<String>,
    pub target: Vec<String>,
    pub rustc: Option<PathBuf>,
    pub cargo: Option<PathBuf>,
    pub local_rebuild: bool,

    // dist misc
    pub dist_sign_folder: Option<PathBuf>,
    pub dist_upload_addr: Option<String>,
    pub dist_gpg_password_file: Option<PathBuf>,

    // libstd features
    pub debug_jemalloc: bool,
    pub use_jemalloc: bool,
    pub backtrace: bool, // support for RUST_BACKTRACE

    // misc
    pub channel: String,
    pub quiet_tests: bool,
    // Fallback musl-root for all targets
    pub musl_root: Option<PathBuf>,
    pub prefix: Option<PathBuf>,
    pub docdir: Option<PathBuf>,
    pub libdir: Option<PathBuf>,
    pub libdir_relative: Option<PathBuf>,
    pub mandir: Option<PathBuf>,
    pub codegen_tests: bool,
    pub nodejs: Option<PathBuf>,
    pub gdb: Option<PathBuf>,
    pub python: Option<PathBuf>,
    pub configure_args: Vec<String>,
    pub openssl_static: bool,
}

/// Per-target configuration stored in the global configuration structure.
#[derive(Default)]
pub struct Target {
    pub llvm_config: Option<PathBuf>,
    pub jemalloc: Option<PathBuf>,
    pub cc: Option<PathBuf>,
    pub cxx: Option<PathBuf>,
    pub ndk: Option<PathBuf>,
    pub musl_root: Option<PathBuf>,
    pub qemu_rootfs: Option<PathBuf>,
}

/// Structure of the `config.toml` file that configuration is read from.
///
/// This structure uses `Decodable` to automatically decode a TOML configuration
/// file into this format, and then this is traversed and written into the above
/// `Config` structure.
#[derive(RustcDecodable, Default)]
struct TomlConfig {
    build: Option<Build>,
    install: Option<Install>,
    llvm: Option<Llvm>,
    rust: Option<Rust>,
    target: Option<HashMap<String, TomlTarget>>,
    dist: Option<Dist>,
}

/// TOML representation of various global build decisions.
#[derive(RustcDecodable, Default, Clone)]
struct Build {
    build: Option<String>,
    host: Vec<String>,
    target: Vec<String>,
    cargo: Option<String>,
    rustc: Option<String>,
    compiler_docs: Option<bool>,
    docs: Option<bool>,
    submodules: Option<bool>,
    gdb: Option<String>,
    locked_deps: Option<bool>,
    vendor: Option<bool>,
    nodejs: Option<String>,
    python: Option<String>,
    full_bootstrap: Option<bool>,
    extended: Option<bool>,
    verbose: Option<usize>,
    sanitizers: Option<bool>,
    openssl_static: Option<bool>,
}

/// TOML representation of various global install decisions.
#[derive(RustcDecodable, Default, Clone)]
struct Install {
    prefix: Option<String>,
    mandir: Option<String>,
    docdir: Option<String>,
    libdir: Option<String>,
}

/// TOML representation of how the LLVM build is configured.
#[derive(RustcDecodable, Default)]
struct Llvm {
    ccache: Option<StringOrBool>,
    ninja: Option<bool>,
    assertions: Option<bool>,
    optimize: Option<bool>,
    release_debuginfo: Option<bool>,
    version_check: Option<bool>,
    static_libstdcpp: Option<bool>,
    targets: Option<String>,
    link_jobs: Option<u32>,
    clean_rebuild: Option<bool>,
}

#[derive(RustcDecodable, Default, Clone)]
struct Dist {
    sign_folder: Option<String>,
    gpg_password_file: Option<String>,
    upload_addr: Option<String>,
    src_tarball: Option<bool>,
}

#[derive(RustcDecodable)]
enum StringOrBool {
    String(String),
    Bool(bool),
}

impl Default for StringOrBool {
    fn default() -> StringOrBool {
        StringOrBool::Bool(false)
    }
}

/// TOML representation of how the Rust build is configured.
#[derive(RustcDecodable, Default)]
struct Rust {
    optimize: Option<bool>,
    codegen_units: Option<u32>,
    debug_assertions: Option<bool>,
    debuginfo: Option<bool>,
    debuginfo_lines: Option<bool>,
    debuginfo_only_std: Option<bool>,
    debug_jemalloc: Option<bool>,
    use_jemalloc: Option<bool>,
    backtrace: Option<bool>,
    default_linker: Option<String>,
    default_ar: Option<String>,
    channel: Option<String>,
    musl_root: Option<String>,
    rpath: Option<bool>,
    optimize_tests: Option<bool>,
    debuginfo_tests: Option<bool>,
    codegen_tests: Option<bool>,
}

/// TOML representation of how each build target is configured.
#[derive(RustcDecodable, Default)]
struct TomlTarget {
    llvm_config: Option<String>,
    jemalloc: Option<String>,
    cc: Option<String>,
    cxx: Option<String>,
    android_ndk: Option<String>,
    musl_root: Option<String>,
    qemu_rootfs: Option<String>,
}

impl Config {
    pub fn parse(build: &str, file: Option<PathBuf>) -> Config {
        let mut config = Config::default();
        config.llvm_optimize = true;
        config.use_jemalloc = true;
        config.backtrace = true;
        config.rust_optimize = true;
        config.rust_optimize_tests = true;
        config.submodules = true;
        config.docs = true;
        config.rust_rpath = true;
        config.rust_codegen_units = 1;
        config.build = build.to_string();
        config.channel = "dev".to_string();
        config.codegen_tests = true;
        config.rust_dist_src = true;

        let toml = file.map(|file| {
            let mut f = t!(File::open(&file));
            let mut toml = String::new();
            t!(f.read_to_string(&mut toml));
            let mut p = Parser::new(&toml);
            let table = match p.parse() {
                Some(table) => table,
                None => {
                    println!("failed to parse TOML configuration '{}':", file.to_str().unwrap());
                    for err in p.errors.iter() {
                        let (loline, locol) = p.to_linecol(err.lo);
                        let (hiline, hicol) = p.to_linecol(err.hi);
                        println!("{}:{}-{}:{}: {}", loline, locol, hiline,
                                 hicol, err.desc);
                    }
                    process::exit(2);
                }
            };
            let mut d = Decoder::new(Value::Table(table));
            match Decodable::decode(&mut d) {
                Ok(cfg) => cfg,
                Err(e) => {
                    println!("failed to decode TOML: {}", e);
                    process::exit(2);
                }
            }
        }).unwrap_or_else(|| TomlConfig::default());

        let build = toml.build.clone().unwrap_or(Build::default());
        set(&mut config.build, build.build.clone());
        config.host.push(config.build.clone());
        for host in build.host.iter() {
            if !config.host.contains(host) {
                config.host.push(host.clone());
            }
        }
        for target in config.host.iter().chain(&build.target) {
            if !config.target.contains(target) {
                config.target.push(target.clone());
            }
        }
        config.rustc = build.rustc.map(PathBuf::from);
        config.cargo = build.cargo.map(PathBuf::from);
        config.nodejs = build.nodejs.map(PathBuf::from);
        config.gdb = build.gdb.map(PathBuf::from);
        config.python = build.python.map(PathBuf::from);
        set(&mut config.compiler_docs, build.compiler_docs);
        set(&mut config.docs, build.docs);
        set(&mut config.submodules, build.submodules);
        set(&mut config.locked_deps, build.locked_deps);
        set(&mut config.vendor, build.vendor);
        set(&mut config.full_bootstrap, build.full_bootstrap);
        set(&mut config.extended, build.extended);
        set(&mut config.verbose, build.verbose);
        set(&mut config.sanitizers, build.sanitizers);
        set(&mut config.openssl_static, build.openssl_static);

        if let Some(ref install) = toml.install {
            config.prefix = install.prefix.clone().map(PathBuf::from);
            config.mandir = install.mandir.clone().map(PathBuf::from);
            config.docdir = install.docdir.clone().map(PathBuf::from);
            config.libdir = install.libdir.clone().map(PathBuf::from);
        }

        if let Some(ref llvm) = toml.llvm {
            match llvm.ccache {
                Some(StringOrBool::String(ref s)) => {
                    config.ccache = Some(s.to_string())
                }
                Some(StringOrBool::Bool(true)) => {
                    config.ccache = Some("ccache".to_string());
                }
                Some(StringOrBool::Bool(false)) | None => {}
            }
            set(&mut config.ninja, llvm.ninja);
            set(&mut config.llvm_assertions, llvm.assertions);
            set(&mut config.llvm_optimize, llvm.optimize);
            set(&mut config.llvm_release_debuginfo, llvm.release_debuginfo);
            set(&mut config.llvm_version_check, llvm.version_check);
            set(&mut config.llvm_static_stdcpp, llvm.static_libstdcpp);
            set(&mut config.llvm_clean_rebuild, llvm.clean_rebuild);
            config.llvm_targets = llvm.targets.clone();
            config.llvm_link_jobs = llvm.link_jobs;
        }

        if let Some(ref rust) = toml.rust {
            set(&mut config.rust_debug_assertions, rust.debug_assertions);
            set(&mut config.rust_debuginfo, rust.debuginfo);
            set(&mut config.rust_debuginfo_lines, rust.debuginfo_lines);
            set(&mut config.rust_debuginfo_only_std, rust.debuginfo_only_std);
            set(&mut config.rust_optimize, rust.optimize);
            set(&mut config.rust_optimize_tests, rust.optimize_tests);
            set(&mut config.rust_debuginfo_tests, rust.debuginfo_tests);
            set(&mut config.codegen_tests, rust.codegen_tests);
            set(&mut config.rust_rpath, rust.rpath);
            set(&mut config.debug_jemalloc, rust.debug_jemalloc);
            set(&mut config.use_jemalloc, rust.use_jemalloc);
            set(&mut config.backtrace, rust.backtrace);
            set(&mut config.channel, rust.channel.clone());
            config.rustc_default_linker = rust.default_linker.clone();
            config.rustc_default_ar = rust.default_ar.clone();
            config.musl_root = rust.musl_root.clone().map(PathBuf::from);

            match rust.codegen_units {
                Some(0) => config.rust_codegen_units = num_cpus::get() as u32,
                Some(n) => config.rust_codegen_units = n,
                None => {}
            }
        }

        if let Some(ref t) = toml.target {
            for (triple, cfg) in t {
                let mut target = Target::default();

                if let Some(ref s) = cfg.llvm_config {
                    target.llvm_config = Some(env::current_dir().unwrap().join(s));
                }
                if let Some(ref s) = cfg.jemalloc {
                    target.jemalloc = Some(env::current_dir().unwrap().join(s));
                }
                if let Some(ref s) = cfg.android_ndk {
                    target.ndk = Some(env::current_dir().unwrap().join(s));
                }
                target.cxx = cfg.cxx.clone().map(PathBuf::from);
                target.cc = cfg.cc.clone().map(PathBuf::from);
                target.musl_root = cfg.musl_root.clone().map(PathBuf::from);
                target.qemu_rootfs = cfg.qemu_rootfs.clone().map(PathBuf::from);

                config.target_config.insert(triple.clone(), target);
            }
        }

        if let Some(ref t) = toml.dist {
            config.dist_sign_folder = t.sign_folder.clone().map(PathBuf::from);
            config.dist_gpg_password_file = t.gpg_password_file.clone().map(PathBuf::from);
            config.dist_upload_addr = t.upload_addr.clone();
            set(&mut config.rust_dist_src, t.src_tarball);
        }

        return config
    }

    /// "Temporary" routine to parse `config.mk` into this configuration.
    ///
    /// While we still have `./configure` this implements the ability to decode
    /// that configuration into this. This isn't exactly a full-blown makefile
    /// parser, but hey, it gets the job done!
    pub fn update_with_config_mk(&mut self) {
        let mut config = String::new();
        File::open("config.mk").unwrap().read_to_string(&mut config).unwrap();
        for line in config.lines() {
            let mut parts = line.splitn(2, ":=").map(|s| s.trim());
            let key = parts.next().unwrap();
            let value = match parts.next() {
                Some(n) if n.starts_with('\"') => &n[1..n.len() - 1],
                Some(n) => n,
                None => continue
            };

            macro_rules! check {
                ($(($name:expr, $val:expr),)*) => {
                    if value == "1" {
                        $(
                            if key == concat!("CFG_ENABLE_", $name) {
                                $val = true;
                                continue
                            }
                            if key == concat!("CFG_DISABLE_", $name) {
                                $val = false;
                                continue
                            }
                        )*
                    }
                }
            }

            check! {
                ("MANAGE_SUBMODULES", self.submodules),
                ("COMPILER_DOCS", self.compiler_docs),
                ("DOCS", self.docs),
                ("LLVM_ASSERTIONS", self.llvm_assertions),
                ("LLVM_RELEASE_DEBUGINFO", self.llvm_release_debuginfo),
                ("OPTIMIZE_LLVM", self.llvm_optimize),
                ("LLVM_VERSION_CHECK", self.llvm_version_check),
                ("LLVM_STATIC_STDCPP", self.llvm_static_stdcpp),
                ("LLVM_LINK_SHARED", self.llvm_link_shared),
                ("LLVM_CLEAN_REBUILD", self.llvm_clean_rebuild),
                ("OPTIMIZE", self.rust_optimize),
                ("DEBUG_ASSERTIONS", self.rust_debug_assertions),
                ("DEBUGINFO", self.rust_debuginfo),
                ("DEBUGINFO_LINES", self.rust_debuginfo_lines),
                ("DEBUGINFO_ONLY_STD", self.rust_debuginfo_only_std),
                ("JEMALLOC", self.use_jemalloc),
                ("DEBUG_JEMALLOC", self.debug_jemalloc),
                ("RPATH", self.rust_rpath),
                ("OPTIMIZE_TESTS", self.rust_optimize_tests),
                ("DEBUGINFO_TESTS", self.rust_debuginfo_tests),
                ("QUIET_TESTS", self.quiet_tests),
                ("LOCAL_REBUILD", self.local_rebuild),
                ("NINJA", self.ninja),
                ("CODEGEN_TESTS", self.codegen_tests),
                ("LOCKED_DEPS", self.locked_deps),
                ("VENDOR", self.vendor),
                ("FULL_BOOTSTRAP", self.full_bootstrap),
                ("EXTENDED", self.extended),
                ("SANITIZERS", self.sanitizers),
                ("DIST_SRC", self.rust_dist_src),
                ("CARGO_OPENSSL_STATIC", self.openssl_static),
            }

            match key {
                "CFG_BUILD" if value.len() > 0 => self.build = value.to_string(),
                "CFG_HOST" if value.len() > 0 => {
                    self.host.extend(value.split(" ").map(|s| s.to_string()));
                }
                "CFG_TARGET" if value.len() > 0 => {
                    self.target.extend(value.split(" ").map(|s| s.to_string()));
                }
                "CFG_MUSL_ROOT" if value.len() > 0 => {
                    self.musl_root = Some(parse_configure_path(value));
                }
                "CFG_MUSL_ROOT_X86_64" if value.len() > 0 => {
                    let target = "x86_64-unknown-linux-musl".to_string();
                    let target = self.target_config.entry(target)
                                     .or_insert(Target::default());
                    target.musl_root = Some(parse_configure_path(value));
                }
                "CFG_MUSL_ROOT_I686" if value.len() > 0 => {
                    let target = "i686-unknown-linux-musl".to_string();
                    let target = self.target_config.entry(target)
                                     .or_insert(Target::default());
                    target.musl_root = Some(parse_configure_path(value));
                }
                "CFG_MUSL_ROOT_ARM" if value.len() > 0 => {
                    let target = "arm-unknown-linux-musleabi".to_string();
                    let target = self.target_config.entry(target)
                                     .or_insert(Target::default());
                    target.musl_root = Some(parse_configure_path(value));
                }
                "CFG_MUSL_ROOT_ARMHF" if value.len() > 0 => {
                    let target = "arm-unknown-linux-musleabihf".to_string();
                    let target = self.target_config.entry(target)
                                     .or_insert(Target::default());
                    target.musl_root = Some(parse_configure_path(value));
                }
                "CFG_MUSL_ROOT_ARMV7" if value.len() > 0 => {
                    let target = "armv7-unknown-linux-musleabihf".to_string();
                    let target = self.target_config.entry(target)
                                     .or_insert(Target::default());
                    target.musl_root = Some(parse_configure_path(value));
                }
                "CFG_DEFAULT_AR" if value.len() > 0 => {
                    self.rustc_default_ar = Some(value.to_string());
                }
                "CFG_DEFAULT_LINKER" if value.len() > 0 => {
                    self.rustc_default_linker = Some(value.to_string());
                }
                "CFG_GDB" if value.len() > 0 => {
                    self.gdb = Some(parse_configure_path(value));
                }
                "CFG_RELEASE_CHANNEL" => {
                    self.channel = value.to_string();
                }
                "CFG_PREFIX" => {
                    self.prefix = Some(PathBuf::from(value));
                }
                "CFG_DOCDIR" => {
                    self.docdir = Some(PathBuf::from(value));
                }
                "CFG_LIBDIR" => {
                    self.libdir = Some(PathBuf::from(value));
                }
                "CFG_LIBDIR_RELATIVE" => {
                    self.libdir_relative = Some(PathBuf::from(value));
                }
                "CFG_MANDIR" => {
536 - self.mandir = Some(PathBuf::from(value)); 537 - } 538 - "CFG_LLVM_ROOT" if value.len() > 0 => { 539 - let target = self.target_config.entry(self.build.clone()) 540 - .or_insert(Target::default()); 541 - let root = parse_configure_path(value); 542 - target.llvm_config = Some(push_exe_path(root, &["bin", "llvm-config"])); 543 - } 544 - "CFG_JEMALLOC_ROOT" if value.len() > 0 => { 545 - let target = self.target_config.entry(self.build.clone()) 546 - .or_insert(Target::default()); 547 - target.jemalloc = Some(parse_configure_path(value).join("libjemalloc_pic.a")); 548 - } 549 - "CFG_ARM_LINUX_ANDROIDEABI_NDK" if value.len() > 0 => { 550 - let target = "arm-linux-androideabi".to_string(); 551 - let target = self.target_config.entry(target) 552 - .or_insert(Target::default()); 553 - target.ndk = Some(parse_configure_path(value)); 554 - } 555 - "CFG_ARMV7_LINUX_ANDROIDEABI_NDK" if value.len() > 0 => { 556 - let target = "armv7-linux-androideabi".to_string(); 557 - let target = self.target_config.entry(target) 558 - .or_insert(Target::default()); 559 - target.ndk = Some(parse_configure_path(value)); 560 - } 561 - "CFG_I686_LINUX_ANDROID_NDK" if value.len() > 0 => { 562 - let target = "i686-linux-android".to_string(); 563 - let target = self.target_config.entry(target) 564 - .or_insert(Target::default()); 565 - target.ndk = Some(parse_configure_path(value)); 566 - } 567 - "CFG_AARCH64_LINUX_ANDROID_NDK" if value.len() > 0 => { 568 - let target = "aarch64-linux-android".to_string(); 569 - let target = self.target_config.entry(target) 570 - .or_insert(Target::default()); 571 - target.ndk = Some(parse_configure_path(value)); 572 - } 573 - "CFG_X86_64_LINUX_ANDROID_NDK" if value.len() > 0 => { 574 - let target = "x86_64-linux-android".to_string(); 575 - let target = self.target_config.entry(target) 576 - .or_insert(Target::default()); 577 - target.ndk = Some(parse_configure_path(value)); 578 - } 579 - "CFG_LOCAL_RUST_ROOT" if value.len() > 0 => { 580 - let path = 
parse_configure_path(value); 581 - self.rustc = Some(push_exe_path(path.clone(), &["bin", "rustc"])); 582 - self.cargo = Some(push_exe_path(path, &["bin", "cargo"])); 583 - } 584 - "CFG_PYTHON" if value.len() > 0 => { 585 - let path = parse_configure_path(value); 586 - self.python = Some(path); 587 - } 588 - "CFG_ENABLE_CCACHE" if value == "1" => { 589 - self.ccache = Some(exe("ccache", &self.build)); 590 - } 591 - "CFG_ENABLE_SCCACHE" if value == "1" => { 592 - self.ccache = Some(exe("sccache", &self.build)); 593 - } 594 - "CFG_CONFIGURE_ARGS" if value.len() > 0 => { 595 - self.configure_args = value.split_whitespace() 596 - .map(|s| s.to_string()) 597 - .collect(); 598 - } 599 - "CFG_QEMU_ARMHF_ROOTFS" if value.len() > 0 => { 600 - let target = "arm-unknown-linux-gnueabihf".to_string(); 601 - let target = self.target_config.entry(target) 602 - .or_insert(Target::default()); 603 - target.qemu_rootfs = Some(parse_configure_path(value)); 604 - } 605 - _ => {} 606 - } 607 - } 608 - } 609 - 610 - pub fn verbose(&self) -> bool { 611 - self.verbose > 0 612 - } 613 - 614 - pub fn very_verbose(&self) -> bool { 615 - self.verbose > 1 616 - } 617 -} 618 - 619 -#[cfg(not(windows))] 620 -fn parse_configure_path(path: &str) -> PathBuf { 621 - path.into() 622 -} 623 - 624 -#[cfg(windows)] 625 -fn parse_configure_path(path: &str) -> PathBuf { 626 - // on windows, configure produces unix style paths e.g. /c/some/path but we 627 - // only want real windows paths 628 - 629 - use std::process::Command; 630 - use build_helper; 631 - 632 - // '/' is invalid in windows paths, so we can detect unix paths by the presence of it 633 - if !path.contains('/') { 634 - return path.into(); 635 - } 636 - 637 - let win_path = build_helper::output(Command::new("cygpath").arg("-w").arg(path)); 638 - let win_path = win_path.trim(); 639 - 640 - win_path.into() 641 -} 642 - 643 -fn set<T>(field: &mut T, val: Option<T>) { 644 - if let Some(v) = val { 645 - *field = v; 646 - } 647 -}
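The `update_with_config_mk` routine above splits each `config.mk` line once on `:=`, trims both halves, and strips surrounding double quotes from the value before dispatching on the key. A minimal standalone sketch of just that parsing step (the helper name `parse_mk_line` is hypothetical, not part of the file above, and it is slightly more defensive about quote stripping than the original):

```rust
// Sketch of the `KEY := "value"` line parsing used by `update_with_config_mk`:
// split once on ":=", trim whitespace, then strip a surrounding pair of
// double quotes from the value if present.
fn parse_mk_line(line: &str) -> Option<(&str, &str)> {
    let mut parts = line.splitn(2, ":=").map(|s| s.trim());
    let key = parts.next()?;
    // Lines without ":=" carry no assignment and are skipped (the original
    // uses `continue` for this case).
    let value = parts.next()?;
    let value = if value.len() >= 2 && value.starts_with('"') && value.ends_with('"') {
        &value[1..value.len() - 1]
    } else {
        value
    };
    Some((key, value))
}

fn main() {
    assert_eq!(parse_mk_line("CFG_RELEASE_CHANNEL := \"nightly\""),
               Some(("CFG_RELEASE_CHANNEL", "nightly")));
    assert_eq!(parse_mk_line("CFG_ENABLE_OPTIMIZE := 1"),
               Some(("CFG_ENABLE_OPTIMIZE", "1")));
    // No ":=" separator: not an assignment line.
    assert_eq!(parse_mk_line("# just a comment"), None);
}
```

Keys of the form `CFG_ENABLE_*`/`CFG_DISABLE_*` with value `"1"` are then routed through the `check!` macro to flip the corresponding boolean field.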
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/config.toml.example version [224f5f6af9].
# Sample TOML configuration file for building Rust.
#
# To configure rustbuild, copy this file to the directory from which you will be
# running the build, and name it config.toml.
#
# All options are commented out by default in this file, and they're commented
# out with their default values. The build system by default looks for
# `config.toml` in the current directory of a build for build configuration, but
# a custom configuration file can also be specified with `--config` to the build
# system.

# =============================================================================
# Tweaking how LLVM is compiled
# =============================================================================
[llvm]

# Indicates whether the LLVM build is a Release or Debug build
#optimize = true

# Indicates whether an LLVM Release build should include debug info
#release-debuginfo = false

# Indicates whether the LLVM assertions are enabled or not
#assertions = false

# Indicates whether ccache is used when building LLVM
#ccache = false
# or alternatively ...
#ccache = "/path/to/ccache"

# If an external LLVM root is specified, we automatically check the version by
# default to make sure it's within the range that we're expecting, but setting
# this flag will indicate that this version check should not be done.
#version-check = false

# Link libstdc++ statically into the librustc_llvm instead of relying on a
# dynamic version to be available.
#static-libstdcpp = false

# Tell the LLVM build system to use Ninja instead of the platform default for
# the generated build system. This can sometimes be faster than make, for
# example.
#ninja = false

# LLVM targets to build support for.
# Note: this is NOT related to Rust compilation targets. However, as Rust is
# dependent on LLVM for code generation, turning targets off here WILL lead to
# the resulting rustc being unable to compile for the disabled architectures.
# Also worth pointing out is that, in case support for new targets is added to
# LLVM, enabling them here doesn't mean Rust is automatically gaining said
# support. You'll need to write a target specification at least, and most
# likely, teach rustc about the C ABI of the target. Get in touch with the
# Rust team and file an issue if you need assistance in porting!
#targets = "X86;ARM;AArch64;Mips;PowerPC;SystemZ;JSBackend;MSP430;Sparc;NVPTX;Hexagon"

# Cap the number of parallel linker invocations when compiling LLVM.
# This can be useful when building LLVM with debug info, which significantly
# increases the size of binaries and consequently the memory required by
# each linker process.
# If absent or 0, linker invocations are treated like any other job and
# controlled by rustbuild's -j parameter.
#link-jobs = 0

# Delete LLVM build directory on LLVM rebuild.
# This option defaults to `false` for local development, but CI may want to
# always perform clean full builds (possibly accelerated by (s)ccache).
#clean-rebuild = false

# =============================================================================
# General build configuration options
# =============================================================================
[build]

# Build triple for the original snapshot compiler. This must be a compiler that
# nightlies are already produced for. The current platform must be able to run
# binaries of this build triple and the nightly will be used to bootstrap the
# first compiler.
#build = "x86_64-unknown-linux-gnu"    # defaults to your host platform

# In addition to the build triple, other triples to produce full compiler
# toolchains for. Each of these triples will be bootstrapped from the build
# triple and then will continue to bootstrap themselves. This platform must
# currently be able to run all of the triples provided here.
#host = ["x86_64-unknown-linux-gnu"]   # defaults to just the build triple

# In addition to all host triples, other triples to produce the standard library
# for. Each host triple will be used to produce a copy of the standard library
# for each target triple.
#target = ["x86_64-unknown-linux-gnu"] # defaults to just the build triple

# Instead of downloading the src/stage0.txt version of Cargo specified, use
# this Cargo binary instead to build all Rust code
#cargo = "/path/to/bin/cargo"

# Instead of downloading the src/stage0.txt version of the compiler
# specified, use this rustc binary instead as the stage0 snapshot compiler.
#rustc = "/path/to/bin/rustc"

# Flag to specify whether any documentation is built. If false, rustdoc and
# friends will still be compiled but they will not be used to generate any
# documentation.
#docs = true

# Indicate whether the compiler should be documented in addition to the standard
# library and facade crates.
#compiler-docs = false

# Indicate whether submodules are managed and updated automatically.
#submodules = true

# The path to (or name of) the GDB executable to use. This is only used for
# executing the debuginfo test suite.
#gdb = "gdb"

# The node.js executable to use. Note that this is only used for the emscripten
# target when running tests, otherwise this can be omitted.
#nodejs = "node"

# Python interpreter to use for various tasks throughout the build, notably
# rustdoc tests, the lldb python interpreter, and some dist bits and pieces.
# Note that Python 2 is currently required.
#python = "python2.7"

# Force Cargo to check that Cargo.lock describes the precise dependency
# set that all the Cargo.toml files create, instead of updating it.
#locked-deps = false

# Indicate whether the vendored sources are used for Rust dependencies or not
#vendor = false

# Typically the build system will build the rust compiler twice. The second
# compiler, however, will simply use its own libraries to link against. If you
# would rather perform a full bootstrap, compiling the compiler three times,
# then you can set this option to true. You shouldn't ever need to set this
# option to true.
#full-bootstrap = false

# Enable a build of the extended rust tool set, which is not only the
# compiler but also tools such as Cargo. This will also produce "combined
# installers" which are used to install Rust and Cargo together. This is
# disabled by default.
#extended = false

# Verbosity level: 0 == not verbose, 1 == verbose, 2 == very verbose
#verbose = 0

# Build the sanitizer runtimes
#sanitizers = false

# Indicates whether the OpenSSL linked into Cargo will be statically linked or
# not. If static linkage is specified then the build system will download a
# known-good version of OpenSSL, compile it, and link it to Cargo.
#openssl-static = false

# =============================================================================
# General install configuration options
# =============================================================================
[install]

# Instead of installing to /usr/local, install to this path instead.
#prefix = "/usr/local"

# Where to install libraries in `prefix` above
#libdir = "lib"

# Where to install man pages in `prefix` above
#mandir = "share/man"

# Where to install documentation in `prefix` above
#docdir = "share/doc/rust"

# =============================================================================
# Options for compiling Rust code itself
# =============================================================================
[rust]

# Whether or not to optimize the compiler and standard library
# Note: the slowness of the non-optimized compiler compiling itself usually
# outweighs the time saved by skipping optimizations, so a full bootstrap
# takes much more time with optimize set to false.
#optimize = true

# Number of codegen units to use for each compiler invocation. A value of 0
# means "the number of cores on this machine", and 1+ is passed through to the
# compiler.
#codegen-units = 1

# Whether or not debug assertions are enabled for the compiler and standard
# library
#debug-assertions = false

# Whether or not debuginfo is emitted
#debuginfo = false

# Whether or not line number debug information is emitted
#debuginfo-lines = false

# Whether or not to only build debuginfo for the standard library if enabled.
# If enabled, this will not compile the compiler with debuginfo, just the
# standard library.
#debuginfo-only-std = false

# Whether or not jemalloc is built and enabled
#use-jemalloc = true

# Whether or not jemalloc is built with its debug option set
#debug-jemalloc = false

# Whether or not `panic!`s generate backtraces (RUST_BACKTRACE)
#backtrace = true

# The default linker that will be used by the generated compiler. Note that this
# is not the linker used to link said compiler.
#default-linker = "cc"

# The default ar utility that will be used by the generated compiler if LLVM
# cannot be used. Note that this is not used to assemble said compiler.
#default-ar = "ar"

# The "channel" for the Rust build to produce. The stable/beta channels only
# allow using stable features, whereas the nightly and dev channels allow using
# nightly features
#channel = "dev"

# By default the `rustc` executable is built with `-Wl,-rpath` flags on Unix
# platforms to ensure that the compiler is usable by default from the build
# directory (as it links to a number of dynamic libraries). This may not be
# desired in distributions, for example.
#rpath = true

# Flag indicating whether tests are compiled with optimizations (the -O flag) or
# with debuginfo (the -g flag)
#optimize-tests = true
#debuginfo-tests = true

# Flag indicating whether codegen tests will be run or not. If you get an error
# saying that the FileCheck executable is missing, you may want to disable this.
#codegen-tests = true

# =============================================================================
# Options for specific targets
#
# Each of the following options is scoped to the specific target triple in
# question and is used for determining how to compile each target.
# =============================================================================
[target.x86_64-unknown-linux-gnu]

# C compiler to be used to compile C code and link Rust code. Note that the
# default value is platform specific, and if not specified it may also depend on
# what platform is crossing to what platform.
#cc = "cc"

# C++ compiler to be used to compile C++ code (e.g. LLVM and our LLVM shims).
# This is only used for host targets.
#cxx = "c++"

# Path to the `llvm-config` binary of the installation of a custom LLVM to link
# against. Note that if this is specified we don't compile LLVM at all for this
# target.
#llvm-config = "../path/to/llvm/root/bin/llvm-config"

# Path to the custom jemalloc static library to link into the standard library
# by default. This is only used if jemalloc is still enabled above
#jemalloc = "/path/to/jemalloc/libjemalloc_pic.a"

# If this target is for Android, this option will be required to specify where
# the NDK for the target lives. This is used to find the C compiler to link and
# build native code.
#android-ndk = "/path/to/ndk"

# The root location of the MUSL installation directory. The library directory
# will also need to contain libunwind.a for an unwinding implementation. Note
# that this option only makes sense for MUSL targets that produce statically
# linked binaries
#musl-root = "..."

# =============================================================================
# Distribution options
#
# These options are related to distribution, mostly for the Rust project itself.
# You probably won't need to concern yourself with any of these options
# =============================================================================
[dist]

# This is the folder of artifacts that the build system will sign. All files in
# this directory will be signed with the default gpg key using the system `gpg`
# binary. The `asc` and `sha256` files will all be output into the standard dist
# output folder (currently `build/dist`)
#
# This folder should be populated ahead of time before the build system is
# invoked.
#sign-folder = "path/to/folder/to/sign"

# This is a file which contains the password of the default gpg key. This will
# be passed to `gpg` down the road when signing all files in `sign-folder`
# above. This should be stored in plaintext.
#gpg-password-file = "path/to/gpg/password"

# The remote address that all artifacts will eventually be uploaded to. The
# build system generates manifests which will point to these urls, and for the
# manifests to be correct they'll have to have the right URLs encoded.
#
# Note that this address should not contain a trailing slash as file names will
# be appended to it.
#upload-addr = "https://example.com/folder"
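The `codegen-units = 0` convention documented above ("the number of cores on this machine") is resolved in `config.rs` by a small match on the parsed option. A self-contained sketch of that resolution logic (the function name `resolve_codegen_units` is hypothetical, and the CPU count is passed as a plain parameter instead of calling `num_cpus::get()` so the sketch has no external dependency):

```rust
// Sketch of how a `codegen-units` value from config.toml is resolved:
// 0 maps to the host CPU count, 1+ passes through, and an absent value
// leaves the built-in default untouched (mirrors the match in config.rs).
fn resolve_codegen_units(configured: Option<u32>, host_cpus: u32) -> Option<u32> {
    match configured {
        Some(0) => Some(host_cpus), // 0 means "one codegen unit per core"
        Some(n) => Some(n),         // explicit value passes straight through
        None => None,               // option commented out: keep the default
    }
}

fn main() {
    assert_eq!(resolve_codegen_units(Some(0), 8), Some(8));
    assert_eq!(resolve_codegen_units(Some(4), 8), Some(4));
    assert_eq!(resolve_codegen_units(None, 8), None);
}
```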
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/dist.rs version [fe285cbc4e].
1 -// Copyright 2016 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Implementation of the various distribution aspects of the compiler. 12 -//! 13 -//! This module is responsible for creating tarballs of the standard library, 14 -//! compiler, and documentation. This ends up being what we distribute to 15 -//! everyone as well. 16 -//! 17 -//! No tarball is actually created literally in this file, but rather we shell 18 -//! out to `rust-installer` still. This may one day be replaced with bits and 19 -//! pieces of `rustup.rs`! 20 - 21 -use std::env; 22 -use std::fs::{self, File}; 23 -use std::io::{Read, Write}; 24 -use std::path::{PathBuf, Path}; 25 -use std::process::{Command, Stdio}; 26 - 27 -use build_helper::output; 28 - 29 -use {Build, Compiler, Mode}; 30 -use channel; 31 -use util::{cp_r, libdir, is_dylib, cp_filtered, copy, exe}; 32 - 33 -fn pkgname(build: &Build, component: &str) -> String { 34 - if component == "cargo" { 35 - format!("{}-{}", component, build.cargo_package_vers()) 36 - } else if component == "rls" { 37 - format!("{}-{}", component, build.package_vers(&build.release_num("rls"))) 38 - } else { 39 - assert!(component.starts_with("rust")); 40 - format!("{}-{}", component, build.rust_package_vers()) 41 - } 42 -} 43 - 44 -fn distdir(build: &Build) -> PathBuf { 45 - build.out.join("dist") 46 -} 47 - 48 -pub fn tmpdir(build: &Build) -> PathBuf { 49 - build.out.join("tmp/dist") 50 -} 51 - 52 -fn rust_installer(build: &Build) -> Command { 53 - build.tool_cmd(&Compiler::new(0, &build.config.build), "rust-installer") 54 
-} 55 - 56 -/// Builds the `rust-docs` installer component. 57 -/// 58 -/// Slurps up documentation from the `stage`'s `host`. 59 -pub fn docs(build: &Build, stage: u32, host: &str) { 60 - println!("Dist docs stage{} ({})", stage, host); 61 - if !build.config.docs { 62 - println!("\tskipping - docs disabled"); 63 - return 64 - } 65 - 66 - let name = pkgname(build, "rust-docs"); 67 - let image = tmpdir(build).join(format!("{}-{}-image", name, host)); 68 - let _ = fs::remove_dir_all(&image); 69 - 70 - let dst = image.join("share/doc/rust/html"); 71 - t!(fs::create_dir_all(&dst)); 72 - let src = build.out.join(host).join("doc"); 73 - cp_r(&src, &dst); 74 - 75 - let mut cmd = rust_installer(build); 76 - cmd.arg("generate") 77 - .arg("--product-name=Rust-Documentation") 78 - .arg("--rel-manifest-dir=rustlib") 79 - .arg("--success-message=Rust-documentation-is-installed.") 80 - .arg("--image-dir").arg(&image) 81 - .arg("--work-dir").arg(&tmpdir(build)) 82 - .arg("--output-dir").arg(&distdir(build)) 83 - .arg(format!("--package-name={}-{}", name, host)) 84 - .arg("--component-name=rust-docs") 85 - .arg("--legacy-manifest-dirs=rustlib,cargo") 86 - .arg("--bulk-dirs=share/doc/rust/html"); 87 - build.run(&mut cmd); 88 - t!(fs::remove_dir_all(&image)); 89 - 90 - // As part of this step, *also* copy the docs directory to a directory which 91 - // buildbot typically uploads. 92 - if host == build.config.build { 93 - let dst = distdir(build).join("doc").join(build.rust_package_vers()); 94 - t!(fs::create_dir_all(&dst)); 95 - cp_r(&src, &dst); 96 - } 97 -} 98 - 99 -/// Build the `rust-mingw` installer component. 100 -/// 101 -/// This contains all the bits and pieces to run the MinGW Windows targets 102 -/// without any extra installed software (e.g. we bundle gcc, libraries, etc). 103 -/// Currently just shells out to a python script, but that should be rewritten 104 -/// in Rust. 
105 -pub fn mingw(build: &Build, host: &str) { 106 - println!("Dist mingw ({})", host); 107 - let name = pkgname(build, "rust-mingw"); 108 - let image = tmpdir(build).join(format!("{}-{}-image", name, host)); 109 - let _ = fs::remove_dir_all(&image); 110 - t!(fs::create_dir_all(&image)); 111 - 112 - // The first argument to the script is a "temporary directory" which is just 113 - // thrown away (this contains the runtime DLLs included in the rustc package 114 - // above) and the second argument is where to place all the MinGW components 115 - // (which is what we want). 116 - // 117 - // FIXME: this script should be rewritten into Rust 118 - let mut cmd = Command::new(build.python()); 119 - cmd.arg(build.src.join("src/etc/make-win-dist.py")) 120 - .arg(tmpdir(build)) 121 - .arg(&image) 122 - .arg(host); 123 - build.run(&mut cmd); 124 - 125 - let mut cmd = rust_installer(build); 126 - cmd.arg("generate") 127 - .arg("--product-name=Rust-MinGW") 128 - .arg("--rel-manifest-dir=rustlib") 129 - .arg("--success-message=Rust-MinGW-is-installed.") 130 - .arg("--image-dir").arg(&image) 131 - .arg("--work-dir").arg(&tmpdir(build)) 132 - .arg("--output-dir").arg(&distdir(build)) 133 - .arg(format!("--package-name={}-{}", name, host)) 134 - .arg("--component-name=rust-mingw") 135 - .arg("--legacy-manifest-dirs=rustlib,cargo"); 136 - build.run(&mut cmd); 137 - t!(fs::remove_dir_all(&image)); 138 -} 139 - 140 -/// Creates the `rustc` installer component. 
141 -pub fn rustc(build: &Build, stage: u32, host: &str) { 142 - println!("Dist rustc stage{} ({})", stage, host); 143 - let name = pkgname(build, "rustc"); 144 - let image = tmpdir(build).join(format!("{}-{}-image", name, host)); 145 - let _ = fs::remove_dir_all(&image); 146 - let overlay = tmpdir(build).join(format!("{}-{}-overlay", name, host)); 147 - let _ = fs::remove_dir_all(&overlay); 148 - 149 - // Prepare the rustc "image", what will actually end up getting installed 150 - prepare_image(build, stage, host, &image); 151 - 152 - // Prepare the overlay which is part of the tarball but won't actually be 153 - // installed 154 - let cp = |file: &str| { 155 - install(&build.src.join(file), &overlay, 0o644); 156 - }; 157 - cp("COPYRIGHT"); 158 - cp("LICENSE-APACHE"); 159 - cp("LICENSE-MIT"); 160 - cp("README.md"); 161 - // tiny morsel of metadata is used by rust-packaging 162 - let version = build.rust_version(); 163 - t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes())); 164 - 165 - // On MinGW we've got a few runtime DLL dependencies that we need to 166 - // include. The first argument to this script is where to put these DLLs 167 - // (the image we're creating), and the second argument is a junk directory 168 - // to ignore all other MinGW stuff the script creates. 169 - // 170 - // On 32-bit MinGW we're always including a DLL which needs some extra 171 - // licenses to distribute. On 64-bit MinGW we don't actually distribute 172 - // anything requiring us to distribute a license, but it's likely the 173 - // install will *also* include the rust-mingw package, which also needs 174 - // licenses, so to be safe we just include it here in all MinGW packages. 
175 - // 176 - // FIXME: this script should be rewritten into Rust 177 - if host.contains("pc-windows-gnu") { 178 - let mut cmd = Command::new(build.python()); 179 - cmd.arg(build.src.join("src/etc/make-win-dist.py")) 180 - .arg(&image) 181 - .arg(tmpdir(build)) 182 - .arg(host); 183 - build.run(&mut cmd); 184 - 185 - let dst = image.join("share/doc"); 186 - t!(fs::create_dir_all(&dst)); 187 - cp_r(&build.src.join("src/etc/third-party"), &dst); 188 - } 189 - 190 - // Finally, wrap everything up in a nice tarball! 191 - let mut cmd = rust_installer(build); 192 - cmd.arg("generate") 193 - .arg("--product-name=Rust") 194 - .arg("--rel-manifest-dir=rustlib") 195 - .arg("--success-message=Rust-is-ready-to-roll.") 196 - .arg("--image-dir").arg(&image) 197 - .arg("--work-dir").arg(&tmpdir(build)) 198 - .arg("--output-dir").arg(&distdir(build)) 199 - .arg("--non-installed-overlay").arg(&overlay) 200 - .arg(format!("--package-name={}-{}", name, host)) 201 - .arg("--component-name=rustc") 202 - .arg("--legacy-manifest-dirs=rustlib,cargo"); 203 - build.run(&mut cmd); 204 - t!(fs::remove_dir_all(&image)); 205 - t!(fs::remove_dir_all(&overlay)); 206 - 207 - fn prepare_image(build: &Build, stage: u32, host: &str, image: &Path) { 208 - let src = build.sysroot(&Compiler::new(stage, host)); 209 - let libdir = libdir(host); 210 - 211 - // Copy rustc/rustdoc binaries 212 - t!(fs::create_dir_all(image.join("bin"))); 213 - cp_r(&src.join("bin"), &image.join("bin")); 214 - 215 - // Copy runtime DLLs needed by the compiler 216 - if libdir != "bin" { 217 - for entry in t!(src.join(libdir).read_dir()).map(|e| t!(e)) { 218 - let name = entry.file_name(); 219 - if let Some(s) = name.to_str() { 220 - if is_dylib(s) { 221 - install(&entry.path(), &image.join(libdir), 0o644); 222 - } 223 - } 224 - } 225 - } 226 - 227 - // Man pages 228 - t!(fs::create_dir_all(image.join("share/man/man1"))); 229 - cp_r(&build.src.join("man"), &image.join("share/man/man1")); 230 - 231 - // Debugger scripts 232 - 
debugger_scripts(build, &image, host); 233 - 234 - // Misc license info 235 - let cp = |file: &str| { 236 - install(&build.src.join(file), &image.join("share/doc/rust"), 0o644); 237 - }; 238 - cp("COPYRIGHT"); 239 - cp("LICENSE-APACHE"); 240 - cp("LICENSE-MIT"); 241 - cp("README.md"); 242 - } 243 -} 244 - 245 -/// Copies debugger scripts for `host` into the `sysroot` specified. 246 -pub fn debugger_scripts(build: &Build, 247 - sysroot: &Path, 248 - host: &str) { 249 - let cp_debugger_script = |file: &str| { 250 - let dst = sysroot.join("lib/rustlib/etc"); 251 - t!(fs::create_dir_all(&dst)); 252 - install(&build.src.join("src/etc/").join(file), &dst, 0o644); 253 - }; 254 - if host.contains("windows-msvc") { 255 - // windbg debugger scripts 256 - install(&build.src.join("src/etc/rust-windbg.cmd"), &sysroot.join("bin"), 257 - 0o755); 258 - 259 - cp_debugger_script("natvis/libcore.natvis"); 260 - cp_debugger_script("natvis/libcollections.natvis"); 261 - } else { 262 - cp_debugger_script("debugger_pretty_printers_common.py"); 263 - 264 - // gdb debugger scripts 265 - install(&build.src.join("src/etc/rust-gdb"), &sysroot.join("bin"), 266 - 0o755); 267 - 268 - cp_debugger_script("gdb_load_rust_pretty_printers.py"); 269 - cp_debugger_script("gdb_rust_pretty_printing.py"); 270 - 271 - // lldb debugger scripts 272 - install(&build.src.join("src/etc/rust-lldb"), &sysroot.join("bin"), 273 - 0o755); 274 - 275 - cp_debugger_script("lldb_rust_formatters.py"); 276 - } 277 -} 278 - 279 -/// Creates the `rust-std` installer component as compiled by `compiler` for the 280 -/// target `target`. 281 -pub fn std(build: &Build, compiler: &Compiler, target: &str) { 282 - println!("Dist std stage{} ({} -> {})", compiler.stage, compiler.host, 283 - target); 284 - 285 - // The only true set of target libraries came from the build triple, so 286 - // let's reduce redundant work by only producing archives from that host. 
287 - if compiler.host != build.config.build { 288 - println!("\tskipping, not a build host"); 289 - return 290 - } 291 - 292 - let name = pkgname(build, "rust-std"); 293 - let image = tmpdir(build).join(format!("{}-{}-image", name, target)); 294 - let _ = fs::remove_dir_all(&image); 295 - 296 - let dst = image.join("lib/rustlib").join(target); 297 - t!(fs::create_dir_all(&dst)); 298 - let src = build.sysroot(compiler).join("lib/rustlib"); 299 - cp_r(&src.join(target), &dst); 300 - 301 - let mut cmd = rust_installer(build); 302 - cmd.arg("generate") 303 - .arg("--product-name=Rust") 304 - .arg("--rel-manifest-dir=rustlib") 305 - .arg("--success-message=std-is-standing-at-the-ready.") 306 - .arg("--image-dir").arg(&image) 307 - .arg("--work-dir").arg(&tmpdir(build)) 308 - .arg("--output-dir").arg(&distdir(build)) 309 - .arg(format!("--package-name={}-{}", name, target)) 310 - .arg(format!("--component-name=rust-std-{}", target)) 311 - .arg("--legacy-manifest-dirs=rustlib,cargo"); 312 - build.run(&mut cmd); 313 - t!(fs::remove_dir_all(&image)); 314 -} 315 - 316 -/// The path to the complete rustc-src tarball 317 -pub fn rust_src_location(build: &Build) -> PathBuf { 318 - let plain_name = format!("rustc-{}-src", build.rust_package_vers()); 319 - distdir(build).join(&format!("{}.tar.gz", plain_name)) 320 -} 321 - 322 -/// The path to the rust-src component installer 323 -pub fn rust_src_installer(build: &Build) -> PathBuf { 324 - let name = pkgname(build, "rust-src"); 325 - distdir(build).join(&format!("{}.tar.gz", name)) 326 -} 327 - 328 -/// Creates a tarball of save-analysis metadata, if available. 
329 -pub fn analysis(build: &Build, compiler: &Compiler, target: &str) { 330 - assert!(build.config.extended); 331 - println!("Dist analysis"); 332 - 333 - if compiler.host != build.config.build { 334 - println!("\tskipping, not a build host"); 335 - return; 336 - } 337 - 338 - // Package save-analysis from stage1 if not doing a full bootstrap, as the 339 - // stage2 artifacts are simply copied from stage1 in that case. 340 - let compiler = if build.force_use_stage1(compiler, target) { 341 - Compiler::new(1, compiler.host) 342 - } else { 343 - compiler.clone() 344 - }; 345 - 346 - let name = pkgname(build, "rust-analysis"); 347 - let image = tmpdir(build).join(format!("{}-{}-image", name, target)); 348 - 349 - let src = build.stage_out(&compiler, Mode::Libstd).join(target).join("release").join("deps"); 350 - 351 - let image_src = src.join("save-analysis"); 352 - let dst = image.join("lib/rustlib").join(target).join("analysis"); 353 - t!(fs::create_dir_all(&dst)); 354 - println!("image_src: {:?}, dst: {:?}", image_src, dst); 355 - cp_r(&image_src, &dst); 356 - 357 - let mut cmd = rust_installer(build); 358 - cmd.arg("generate") 359 - .arg("--product-name=Rust") 360 - .arg("--rel-manifest-dir=rustlib") 361 - .arg("--success-message=save-analysis-saved.") 362 - .arg("--image-dir").arg(&image) 363 - .arg("--work-dir").arg(&tmpdir(build)) 364 - .arg("--output-dir").arg(&distdir(build)) 365 - .arg(format!("--package-name={}-{}", name, target)) 366 - .arg(format!("--component-name=rust-analysis-{}", target)) 367 - .arg("--legacy-manifest-dirs=rustlib,cargo"); 368 - build.run(&mut cmd); 369 - t!(fs::remove_dir_all(&image)); 370 -} 371 - 372 -const CARGO_VENDOR_VERSION: &'static str = "0.1.4"; 373 - 374 -/// Creates the `rust-src` installer component and the plain source tarball 375 -pub fn rust_src(build: &Build) { 376 - if !build.config.rust_dist_src { 377 - return 378 - } 379 - 380 - println!("Dist src"); 381 - 382 - // Make sure that the root folder of the tarball has the 
correct name 383 - let plain_name = format!("rustc-{}-src", build.rust_package_vers()); 384 - let plain_dst_src = tmpdir(build).join(&plain_name); 385 - let _ = fs::remove_dir_all(&plain_dst_src); 386 - t!(fs::create_dir_all(&plain_dst_src)); 387 - 388 - // This is the set of root paths which will become part of the source package 389 - let src_files = [ 390 - "COPYRIGHT", 391 - "LICENSE-APACHE", 392 - "LICENSE-MIT", 393 - "CONTRIBUTING.md", 394 - "README.md", 395 - "RELEASES.md", 396 - "configure", 397 - "x.py", 398 - ]; 399 - let src_dirs = [ 400 - "man", 401 - "src", 402 - ]; 403 - 404 - let filter_fn = move |path: &Path| { 405 - let spath = match path.to_str() { 406 - Some(path) => path, 407 - None => return false, 408 - }; 409 - if spath.ends_with("~") || spath.ends_with(".pyc") { 410 - return false 411 - } 412 - if spath.contains("llvm/test") || spath.contains("llvm\\test") { 413 - if spath.ends_with(".ll") || 414 - spath.ends_with(".td") || 415 - spath.ends_with(".s") { 416 - return false 417 - } 418 - } 419 - 420 - let excludes = [ 421 - "CVS", "RCS", "SCCS", ".git", ".gitignore", ".gitmodules", 422 - ".gitattributes", ".cvsignore", ".svn", ".arch-ids", "{arch}", 423 - "=RELEASE-ID", "=meta-update", "=update", ".bzr", ".bzrignore", 424 - ".bzrtags", ".hg", ".hgignore", ".hgrags", "_darcs", 425 - ]; 426 - !path.iter() 427 - .map(|s| s.to_str().unwrap()) 428 - .any(|s| excludes.contains(&s)) 429 - }; 430 - 431 - // Copy the directories using our filter 432 - for item in &src_dirs { 433 - let dst = &plain_dst_src.join(item); 434 - t!(fs::create_dir(dst)); 435 - cp_filtered(&build.src.join(item), dst, &filter_fn); 436 - } 437 - // Copy the files normally 438 - for item in &src_files { 439 - copy(&build.src.join(item), &plain_dst_src.join(item)); 440 - } 441 - 442 - // If we're building from git sources, we need to vendor a complete distribution. 443 - if build.src_is_git { 444 - // Get cargo-vendor installed, if it isn't already. 
445 - let mut has_cargo_vendor = false; 446 - let mut cmd = Command::new(&build.cargo); 447 - for line in output(cmd.arg("install").arg("--list")).lines() { 448 - has_cargo_vendor |= line.starts_with("cargo-vendor "); 449 - } 450 - if !has_cargo_vendor { 451 - let mut cmd = Command::new(&build.cargo); 452 - cmd.arg("install") 453 - .arg("--force") 454 - .arg("--debug") 455 - .arg("--vers").arg(CARGO_VENDOR_VERSION) 456 - .arg("cargo-vendor") 457 - .env("RUSTC", &build.rustc); 458 - build.run(&mut cmd); 459 - } 460 - 461 - // Vendor all Cargo dependencies 462 - let mut cmd = Command::new(&build.cargo); 463 - cmd.arg("vendor") 464 - .current_dir(&plain_dst_src.join("src")); 465 - build.run(&mut cmd); 466 - } 467 - 468 - // Create the version file 469 - write_file(&plain_dst_src.join("version"), build.rust_version().as_bytes()); 470 - 471 - // Create plain source tarball 472 - let mut tarball = rust_src_location(build); 473 - tarball.set_extension(""); // strip .gz 474 - tarball.set_extension(""); // strip .tar 475 - if let Some(dir) = tarball.parent() { 476 - t!(fs::create_dir_all(dir)); 477 - } 478 - let mut cmd = rust_installer(build); 479 - cmd.arg("tarball") 480 - .arg("--input").arg(&plain_name) 481 - .arg("--output").arg(&tarball) 482 - .arg("--work-dir=.") 483 - .current_dir(tmpdir(build)); 484 - build.run(&mut cmd); 485 - 486 - 487 - let name = pkgname(build, "rust-src"); 488 - let image = tmpdir(build).join(format!("{}-image", name)); 489 - let _ = fs::remove_dir_all(&image); 490 - 491 - let dst = image.join("lib/rustlib/src"); 492 - let dst_src = dst.join("rust"); 493 - t!(fs::create_dir_all(&dst_src)); 494 - 495 - // This is the reduced set of paths which will become the rust-src component 496 - // (essentially libstd and all of its path dependencies) 497 - let std_src_dirs = [ 498 - "src/build_helper", 499 - "src/liballoc", 500 - "src/liballoc_jemalloc", 501 - "src/liballoc_system", 502 - "src/libcollections", 503 - "src/libcompiler_builtins", 504 - 
"src/libcore", 505 - "src/liblibc", 506 - "src/libpanic_abort", 507 - "src/libpanic_unwind", 508 - "src/librand", 509 - "src/librustc_asan", 510 - "src/librustc_lsan", 511 - "src/librustc_msan", 512 - "src/librustc_tsan", 513 - "src/libstd", 514 - "src/libstd_unicode", 515 - "src/libunwind", 516 - "src/rustc/libc_shim", 517 - ]; 518 - 519 - for item in &std_src_dirs { 520 - let dst = &dst_src.join(item); 521 - t!(fs::create_dir_all(dst)); 522 - cp_r(&plain_dst_src.join(item), dst); 523 - } 524 - 525 - // Create source tarball in rust-installer format 526 - let mut cmd = rust_installer(build); 527 - cmd.arg("generate") 528 - .arg("--product-name=Rust") 529 - .arg("--rel-manifest-dir=rustlib") 530 - .arg("--success-message=Awesome-Source.") 531 - .arg("--image-dir").arg(&image) 532 - .arg("--work-dir").arg(&tmpdir(build)) 533 - .arg("--output-dir").arg(&distdir(build)) 534 - .arg(format!("--package-name={}", name)) 535 - .arg("--component-name=rust-src") 536 - .arg("--legacy-manifest-dirs=rustlib,cargo"); 537 - build.run(&mut cmd); 538 - 539 - t!(fs::remove_dir_all(&image)); 540 - t!(fs::remove_dir_all(&plain_dst_src)); 541 -} 542 - 543 -fn install(src: &Path, dstdir: &Path, perms: u32) { 544 - let dst = dstdir.join(src.file_name().unwrap()); 545 - t!(fs::create_dir_all(dstdir)); 546 - t!(fs::copy(src, &dst)); 547 - chmod(&dst, perms); 548 -} 549 - 550 -#[cfg(unix)] 551 -fn chmod(path: &Path, perms: u32) { 552 - use std::os::unix::fs::*; 553 - t!(fs::set_permissions(path, fs::Permissions::from_mode(perms))); 554 -} 555 -#[cfg(windows)] 556 -fn chmod(_path: &Path, _perms: u32) {} 557 - 558 -// We have to run a few shell scripts, which choke quite a bit on both `\` 559 -// characters and on `C:\` paths, so normalize both of them away. 
560 -pub fn sanitize_sh(path: &Path) -> String { 561 - let path = path.to_str().unwrap().replace("\\", "/"); 562 - return change_drive(&path).unwrap_or(path); 563 - 564 - fn change_drive(s: &str) -> Option<String> { 565 - let mut ch = s.chars(); 566 - let drive = ch.next().unwrap_or('C'); 567 - if ch.next() != Some(':') { 568 - return None 569 - } 570 - if ch.next() != Some('/') { 571 - return None 572 - } 573 - Some(format!("/{}/{}", drive, &s[drive.len_utf8() + 2..])) 574 - } 575 -} 576 - 577 -fn write_file(path: &Path, data: &[u8]) { 578 - let mut vf = t!(fs::File::create(path)); 579 - t!(vf.write_all(data)); 580 -} 581 - 582 -pub fn cargo(build: &Build, stage: u32, target: &str) { 583 - println!("Dist cargo stage{} ({})", stage, target); 584 - let compiler = Compiler::new(stage, &build.config.build); 585 - 586 - let src = build.src.join("src/tools/cargo"); 587 - let etc = src.join("src/etc"); 588 - let release_num = build.release_num("cargo"); 589 - let name = pkgname(build, "cargo"); 590 - let version = build.cargo_info.version(build, &release_num); 591 - 592 - let tmp = tmpdir(build); 593 - let image = tmp.join("cargo-image"); 594 - drop(fs::remove_dir_all(&image)); 595 - t!(fs::create_dir_all(&image)); 596 - 597 - // Prepare the image directory 598 - t!(fs::create_dir_all(image.join("share/zsh/site-functions"))); 599 - t!(fs::create_dir_all(image.join("etc/bash_completions.d"))); 600 - let cargo = build.cargo_out(&compiler, Mode::Tool, target) 601 - .join(exe("cargo", target)); 602 - install(&cargo, &image.join("bin"), 0o755); 603 - for man in t!(etc.join("man").read_dir()) { 604 - let man = t!(man); 605 - install(&man.path(), &image.join("share/man/man1"), 0o644); 606 - } 607 - install(&etc.join("_cargo"), &image.join("share/zsh/site-functions"), 0o644); 608 - copy(&etc.join("cargo.bashcomp.sh"), 609 - &image.join("etc/bash_completions.d/cargo")); 610 - let doc = image.join("share/doc/cargo"); 611 - install(&src.join("README.md"), &doc, 0o644); 612 - 
install(&src.join("LICENSE-MIT"), &doc, 0o644); 613 - install(&src.join("LICENSE-APACHE"), &doc, 0o644); 614 - install(&src.join("LICENSE-THIRD-PARTY"), &doc, 0o644); 615 - 616 - // Prepare the overlay 617 - let overlay = tmp.join("cargo-overlay"); 618 - drop(fs::remove_dir_all(&overlay)); 619 - t!(fs::create_dir_all(&overlay)); 620 - install(&src.join("README.md"), &overlay, 0o644); 621 - install(&src.join("LICENSE-MIT"), &overlay, 0o644); 622 - install(&src.join("LICENSE-APACHE"), &overlay, 0o644); 623 - install(&src.join("LICENSE-THIRD-PARTY"), &overlay, 0o644); 624 - t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes())); 625 - 626 - // Generate the installer tarball 627 - let mut cmd = rust_installer(build); 628 - cmd.arg("generate") 629 - .arg("--product-name=Rust") 630 - .arg("--rel-manifest-dir=rustlib") 631 - .arg("--success-message=Rust-is-ready-to-roll.") 632 - .arg("--image-dir").arg(&image) 633 - .arg("--work-dir").arg(&tmpdir(build)) 634 - .arg("--output-dir").arg(&distdir(build)) 635 - .arg("--non-installed-overlay").arg(&overlay) 636 - .arg(format!("--package-name={}-{}", name, target)) 637 - .arg("--component-name=cargo") 638 - .arg("--legacy-manifest-dirs=rustlib,cargo"); 639 - build.run(&mut cmd); 640 -} 641 - 642 -pub fn rls(build: &Build, stage: u32, target: &str) { 643 - assert!(build.config.extended); 644 - println!("Dist RLS stage{} ({})", stage, target); 645 - let compiler = Compiler::new(stage, &build.config.build); 646 - 647 - let src = build.src.join("src/tools/rls"); 648 - let release_num = build.release_num("rls"); 649 - let name = pkgname(build, "rls"); 650 - let version = build.rls_info.version(build, &release_num); 651 - 652 - let tmp = tmpdir(build); 653 - let image = tmp.join("rls-image"); 654 - drop(fs::remove_dir_all(&image)); 655 - t!(fs::create_dir_all(&image)); 656 - 657 - // Prepare the image directory 658 - let rls = build.cargo_out(&compiler, Mode::Tool, target) 659 - .join(exe("rls", target)); 660 - 
install(&rls, &image.join("bin"), 0o755); 661 - let doc = image.join("share/doc/rls"); 662 - install(&src.join("README.md"), &doc, 0o644); 663 - install(&src.join("LICENSE-MIT"), &doc, 0o644); 664 - install(&src.join("LICENSE-APACHE"), &doc, 0o644); 665 - 666 - // Prepare the overlay 667 - let overlay = tmp.join("rls-overlay"); 668 - drop(fs::remove_dir_all(&overlay)); 669 - t!(fs::create_dir_all(&overlay)); 670 - install(&src.join("README.md"), &overlay, 0o644); 671 - install(&src.join("LICENSE-MIT"), &overlay, 0o644); 672 - install(&src.join("LICENSE-APACHE"), &overlay, 0o644); 673 - t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes())); 674 - 675 - // Generate the installer tarball 676 - let mut cmd = rust_installer(build); 677 - cmd.arg("generate") 678 - .arg("--product-name=Rust") 679 - .arg("--rel-manifest-dir=rustlib") 680 - .arg("--success-message=RLS-ready-to-serve.") 681 - .arg("--image-dir").arg(&image) 682 - .arg("--work-dir").arg(&tmpdir(build)) 683 - .arg("--output-dir").arg(&distdir(build)) 684 - .arg("--non-installed-overlay").arg(&overlay) 685 - .arg(format!("--package-name={}-{}", name, target)) 686 - .arg("--component-name=rls") 687 - .arg("--legacy-manifest-dirs=rustlib,cargo"); 688 - build.run(&mut cmd); 689 -} 690 - 691 -/// Creates a combined installer for the specified target in the provided stage. 
692 -pub fn extended(build: &Build, stage: u32, target: &str) { 693 - println!("Dist extended stage{} ({})", stage, target); 694 - 695 - let dist = distdir(build); 696 - let rustc_installer = dist.join(format!("{}-{}.tar.gz", 697 - pkgname(build, "rustc"), 698 - target)); 699 - let cargo_installer = dist.join(format!("{}-{}.tar.gz", 700 - pkgname(build, "cargo"), 701 - target)); 702 - let rls_installer = dist.join(format!("{}-{}.tar.gz", 703 - pkgname(build, "rls"), 704 - target)); 705 - let analysis_installer = dist.join(format!("{}-{}.tar.gz", 706 - pkgname(build, "rust-analysis"), 707 - target)); 708 - let docs_installer = dist.join(format!("{}-{}.tar.gz", 709 - pkgname(build, "rust-docs"), 710 - target)); 711 - let mingw_installer = dist.join(format!("{}-{}.tar.gz", 712 - pkgname(build, "rust-mingw"), 713 - target)); 714 - let std_installer = dist.join(format!("{}-{}.tar.gz", 715 - pkgname(build, "rust-std"), 716 - target)); 717 - 718 - let tmp = tmpdir(build); 719 - let overlay = tmp.join("extended-overlay"); 720 - let etc = build.src.join("src/etc/installer"); 721 - let work = tmp.join("work"); 722 - 723 - let _ = fs::remove_dir_all(&overlay); 724 - install(&build.src.join("COPYRIGHT"), &overlay, 0o644); 725 - install(&build.src.join("LICENSE-APACHE"), &overlay, 0o644); 726 - install(&build.src.join("LICENSE-MIT"), &overlay, 0o644); 727 - let version = build.rust_version(); 728 - t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes())); 729 - install(&etc.join("README.md"), &overlay, 0o644); 730 - 731 - // When rust-std package split from rustc, we needed to ensure that during 732 - // upgrades rustc was upgraded before rust-std. To avoid rustc clobbering 733 - // the std files during uninstall. To do this ensure that rustc comes 734 - // before rust-std in the list below. 
735 - let mut tarballs = vec![rustc_installer, cargo_installer, rls_installer, 736 - analysis_installer, docs_installer, std_installer]; 737 - if target.contains("pc-windows-gnu") { 738 - tarballs.push(mingw_installer); 739 - } 740 - let mut input_tarballs = tarballs[0].as_os_str().to_owned(); 741 - for tarball in &tarballs[1..] { 742 - input_tarballs.push(","); 743 - input_tarballs.push(tarball); 744 - } 745 - 746 - let mut cmd = rust_installer(build); 747 - cmd.arg("combine") 748 - .arg("--product-name=Rust") 749 - .arg("--rel-manifest-dir=rustlib") 750 - .arg("--success-message=Rust-is-ready-to-roll.") 751 - .arg("--work-dir").arg(&work) 752 - .arg("--output-dir").arg(&distdir(build)) 753 - .arg(format!("--package-name={}-{}", pkgname(build, "rust"), target)) 754 - .arg("--legacy-manifest-dirs=rustlib,cargo") 755 - .arg("--input-tarballs").arg(input_tarballs) 756 - .arg("--non-installed-overlay").arg(&overlay); 757 - build.run(&mut cmd); 758 - 759 - let mut license = String::new(); 760 - t!(t!(File::open(build.src.join("COPYRIGHT"))).read_to_string(&mut license)); 761 - license.push_str("\n"); 762 - t!(t!(File::open(build.src.join("LICENSE-APACHE"))).read_to_string(&mut license)); 763 - license.push_str("\n"); 764 - t!(t!(File::open(build.src.join("LICENSE-MIT"))).read_to_string(&mut license)); 765 - 766 - let rtf = r"{\rtf1\ansi\deff0{\fonttbl{\f0\fnil\fcharset0 Arial;}}\nowwrap\fs18"; 767 - let mut rtf = rtf.to_string(); 768 - rtf.push_str("\n"); 769 - for line in license.lines() { 770 - rtf.push_str(line); 771 - rtf.push_str("\\line "); 772 - } 773 - rtf.push_str("}"); 774 - 775 - if target.contains("apple-darwin") { 776 - let pkg = tmp.join("pkg"); 777 - let _ = fs::remove_dir_all(&pkg); 778 - t!(fs::create_dir_all(pkg.join("rustc"))); 779 - t!(fs::create_dir_all(pkg.join("cargo"))); 780 - t!(fs::create_dir_all(pkg.join("rust-docs"))); 781 - t!(fs::create_dir_all(pkg.join("rust-std"))); 782 - 783 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "rustc"), 
target)), 784 - &pkg.join("rustc")); 785 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "cargo"), target)), 786 - &pkg.join("cargo")); 787 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-docs"), target)), 788 - &pkg.join("rust-docs")); 789 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-std"), target)), 790 - &pkg.join("rust-std")); 791 - 792 - install(&etc.join("pkg/postinstall"), &pkg.join("rustc"), 0o755); 793 - install(&etc.join("pkg/postinstall"), &pkg.join("cargo"), 0o755); 794 - install(&etc.join("pkg/postinstall"), &pkg.join("rust-docs"), 0o755); 795 - install(&etc.join("pkg/postinstall"), &pkg.join("rust-std"), 0o755); 796 - 797 - let pkgbuild = |component: &str| { 798 - let mut cmd = Command::new("pkgbuild"); 799 - cmd.arg("--identifier").arg(format!("org.rust-lang.{}", component)) 800 - .arg("--scripts").arg(pkg.join(component)) 801 - .arg("--nopayload") 802 - .arg(pkg.join(component).with_extension("pkg")); 803 - build.run(&mut cmd); 804 - }; 805 - pkgbuild("rustc"); 806 - pkgbuild("cargo"); 807 - pkgbuild("rust-docs"); 808 - pkgbuild("rust-std"); 809 - 810 - // create an 'uninstall' package 811 - install(&etc.join("pkg/postinstall"), &pkg.join("uninstall"), 0o755); 812 - pkgbuild("uninstall"); 813 - 814 - t!(fs::create_dir_all(pkg.join("res"))); 815 - t!(t!(File::create(pkg.join("res/LICENSE.txt"))).write_all(license.as_bytes())); 816 - install(&etc.join("gfx/rust-logo.png"), &pkg.join("res"), 0o644); 817 - let mut cmd = Command::new("productbuild"); 818 - cmd.arg("--distribution").arg(etc.join("pkg/Distribution.xml")) 819 - .arg("--resources").arg(pkg.join("res")) 820 - .arg(distdir(build).join(format!("{}-{}.pkg", 821 - pkgname(build, "rust"), 822 - target))) 823 - .arg("--package-path").arg(&pkg); 824 - build.run(&mut cmd); 825 - } 826 - 827 - if target.contains("windows") { 828 - let exe = tmp.join("exe"); 829 - let _ = fs::remove_dir_all(&exe); 830 - t!(fs::create_dir_all(exe.join("rustc"))); 831 - 
t!(fs::create_dir_all(exe.join("cargo"))); 832 - t!(fs::create_dir_all(exe.join("rust-docs"))); 833 - t!(fs::create_dir_all(exe.join("rust-std"))); 834 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "rustc"), target)) 835 - .join("rustc"), 836 - &exe.join("rustc")); 837 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "cargo"), target)) 838 - .join("cargo"), 839 - &exe.join("cargo")); 840 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-docs"), target)) 841 - .join("rust-docs"), 842 - &exe.join("rust-docs")); 843 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-std"), target)) 844 - .join(format!("rust-std-{}", target)), 845 - &exe.join("rust-std")); 846 - 847 - t!(fs::remove_file(exe.join("rustc/manifest.in"))); 848 - t!(fs::remove_file(exe.join("cargo/manifest.in"))); 849 - t!(fs::remove_file(exe.join("rust-docs/manifest.in"))); 850 - t!(fs::remove_file(exe.join("rust-std/manifest.in"))); 851 - 852 - if target.contains("windows-gnu") { 853 - t!(fs::create_dir_all(exe.join("rust-mingw"))); 854 - cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-mingw"), target)) 855 - .join("rust-mingw"), 856 - &exe.join("rust-mingw")); 857 - t!(fs::remove_file(exe.join("rust-mingw/manifest.in"))); 858 - } 859 - 860 - install(&etc.join("exe/rust.iss"), &exe, 0o644); 861 - install(&etc.join("exe/modpath.iss"), &exe, 0o644); 862 - install(&etc.join("exe/upgrade.iss"), &exe, 0o644); 863 - install(&etc.join("gfx/rust-logo.ico"), &exe, 0o644); 864 - t!(t!(File::create(exe.join("LICENSE.txt"))).write_all(license.as_bytes())); 865 - 866 - // Generate exe installer 867 - let mut cmd = Command::new("iscc"); 868 - cmd.arg("rust.iss") 869 - .current_dir(&exe); 870 - if target.contains("windows-gnu") { 871 - cmd.arg("/dMINGW"); 872 - } 873 - add_env(build, &mut cmd, target); 874 - build.run(&mut cmd); 875 - install(&exe.join(format!("{}-{}.exe", pkgname(build, "rust"), target)), 876 - &distdir(build), 877 - 0o755); 878 - 879 - // Generate msi installer 880 - let wix 
= PathBuf::from(env::var_os("WIX").unwrap()); 881 - let heat = wix.join("bin/heat.exe"); 882 - let candle = wix.join("bin/candle.exe"); 883 - let light = wix.join("bin/light.exe"); 884 - 885 - let heat_flags = ["-nologo", "-gg", "-sfrag", "-srd", "-sreg"]; 886 - build.run(Command::new(&heat) 887 - .current_dir(&exe) 888 - .arg("dir") 889 - .arg("rustc") 890 - .args(&heat_flags) 891 - .arg("-cg").arg("RustcGroup") 892 - .arg("-dr").arg("Rustc") 893 - .arg("-var").arg("var.RustcDir") 894 - .arg("-out").arg(exe.join("RustcGroup.wxs"))); 895 - build.run(Command::new(&heat) 896 - .current_dir(&exe) 897 - .arg("dir") 898 - .arg("rust-docs") 899 - .args(&heat_flags) 900 - .arg("-cg").arg("DocsGroup") 901 - .arg("-dr").arg("Docs") 902 - .arg("-var").arg("var.DocsDir") 903 - .arg("-out").arg(exe.join("DocsGroup.wxs")) 904 - .arg("-t").arg(etc.join("msi/squash-components.xsl"))); 905 - build.run(Command::new(&heat) 906 - .current_dir(&exe) 907 - .arg("dir") 908 - .arg("cargo") 909 - .args(&heat_flags) 910 - .arg("-cg").arg("CargoGroup") 911 - .arg("-dr").arg("Cargo") 912 - .arg("-var").arg("var.CargoDir") 913 - .arg("-out").arg(exe.join("CargoGroup.wxs")) 914 - .arg("-t").arg(etc.join("msi/remove-duplicates.xsl"))); 915 - build.run(Command::new(&heat) 916 - .current_dir(&exe) 917 - .arg("dir") 918 - .arg("rust-std") 919 - .args(&heat_flags) 920 - .arg("-cg").arg("StdGroup") 921 - .arg("-dr").arg("Std") 922 - .arg("-var").arg("var.StdDir") 923 - .arg("-out").arg(exe.join("StdGroup.wxs"))); 924 - if target.contains("windows-gnu") { 925 - build.run(Command::new(&heat) 926 - .current_dir(&exe) 927 - .arg("dir") 928 - .arg("rust-mingw") 929 - .args(&heat_flags) 930 - .arg("-cg").arg("GccGroup") 931 - .arg("-dr").arg("Gcc") 932 - .arg("-var").arg("var.GccDir") 933 - .arg("-out").arg(exe.join("GccGroup.wxs"))); 934 - } 935 - 936 - let candle = |input: &Path| { 937 - let output = exe.join(input.file_stem().unwrap()) 938 - .with_extension("wixobj"); 939 - let arch = if 
target.contains("x86_64") {"x64"} else {"x86"}; 940 - let mut cmd = Command::new(&candle); 941 - cmd.current_dir(&exe) 942 - .arg("-nologo") 943 - .arg("-dRustcDir=rustc") 944 - .arg("-dDocsDir=rust-docs") 945 - .arg("-dCargoDir=cargo") 946 - .arg("-dStdDir=rust-std") 947 - .arg("-arch").arg(&arch) 948 - .arg("-out").arg(&output) 949 - .arg(&input); 950 - add_env(build, &mut cmd, target); 951 - 952 - if target.contains("windows-gnu") { 953 - cmd.arg("-dGccDir=rust-mingw"); 954 - } 955 - build.run(&mut cmd); 956 - }; 957 - candle(&etc.join("msi/rust.wxs")); 958 - candle(&etc.join("msi/ui.wxs")); 959 - candle(&etc.join("msi/rustwelcomedlg.wxs")); 960 - candle("RustcGroup.wxs".as_ref()); 961 - candle("DocsGroup.wxs".as_ref()); 962 - candle("CargoGroup.wxs".as_ref()); 963 - candle("StdGroup.wxs".as_ref()); 964 - 965 - if target.contains("windows-gnu") { 966 - candle("GccGroup.wxs".as_ref()); 967 - } 968 - 969 - t!(t!(File::create(exe.join("LICENSE.rtf"))).write_all(rtf.as_bytes())); 970 - install(&etc.join("gfx/banner.bmp"), &exe, 0o644); 971 - install(&etc.join("gfx/dialogbg.bmp"), &exe, 0o644); 972 - 973 - let filename = format!("{}-{}.msi", pkgname(build, "rust"), target); 974 - let mut cmd = Command::new(&light); 975 - cmd.arg("-nologo") 976 - .arg("-ext").arg("WixUIExtension") 977 - .arg("-ext").arg("WixUtilExtension") 978 - .arg("-out").arg(exe.join(&filename)) 979 - .arg("rust.wixobj") 980 - .arg("ui.wixobj") 981 - .arg("rustwelcomedlg.wixobj") 982 - .arg("RustcGroup.wixobj") 983 - .arg("DocsGroup.wixobj") 984 - .arg("CargoGroup.wixobj") 985 - .arg("StdGroup.wixobj") 986 - .current_dir(&exe); 987 - 988 - if target.contains("windows-gnu") { 989 - cmd.arg("GccGroup.wixobj"); 990 - } 991 - // ICE57 wrongly complains about the shortcuts 992 - cmd.arg("-sice:ICE57"); 993 - 994 - build.run(&mut cmd); 995 - 996 - t!(fs::rename(exe.join(&filename), distdir(build).join(&filename))); 997 - } 998 -} 999 - 1000 -fn add_env(build: &Build, cmd: &mut Command, target: &str) { 
1001 - let mut parts = channel::CFG_RELEASE_NUM.split('.'); 1002 - cmd.env("CFG_RELEASE_INFO", build.rust_version()) 1003 - .env("CFG_RELEASE_NUM", channel::CFG_RELEASE_NUM) 1004 - .env("CFG_RELEASE", build.rust_release()) 1005 - .env("CFG_PRERELEASE_VERSION", channel::CFG_PRERELEASE_VERSION) 1006 - .env("CFG_VER_MAJOR", parts.next().unwrap()) 1007 - .env("CFG_VER_MINOR", parts.next().unwrap()) 1008 - .env("CFG_VER_PATCH", parts.next().unwrap()) 1009 - .env("CFG_VER_BUILD", "0") // just needed to build 1010 - .env("CFG_PACKAGE_VERS", build.rust_package_vers()) 1011 - .env("CFG_PACKAGE_NAME", pkgname(build, "rust")) 1012 - .env("CFG_BUILD", target) 1013 - .env("CFG_CHANNEL", &build.config.channel); 1014 - 1015 - if target.contains("windows-gnu") { 1016 - cmd.env("CFG_MINGW", "1") 1017 - .env("CFG_ABI", "GNU"); 1018 - } else { 1019 - cmd.env("CFG_MINGW", "0") 1020 - .env("CFG_ABI", "MSVC"); 1021 - } 1022 - 1023 - if target.contains("x86_64") { 1024 - cmd.env("CFG_PLATFORM", "x64"); 1025 - } else { 1026 - cmd.env("CFG_PLATFORM", "x86"); 1027 - } 1028 -} 1029 - 1030 -pub fn hash_and_sign(build: &Build) { 1031 - let compiler = Compiler::new(0, &build.config.build); 1032 - let mut cmd = build.tool_cmd(&compiler, "build-manifest"); 1033 - let sign = build.config.dist_sign_folder.as_ref().unwrap_or_else(|| { 1034 - panic!("\n\nfailed to specify `dist.sign-folder` in `config.toml`\n\n") 1035 - }); 1036 - let addr = build.config.dist_upload_addr.as_ref().unwrap_or_else(|| { 1037 - panic!("\n\nfailed to specify `dist.upload-addr` in `config.toml`\n\n") 1038 - }); 1039 - let file = build.config.dist_gpg_password_file.as_ref().unwrap_or_else(|| { 1040 - panic!("\n\nfailed to specify `dist.gpg-password-file` in `config.toml`\n\n") 1041 - }); 1042 - let mut pass = String::new(); 1043 - t!(t!(File::open(&file)).read_to_string(&mut pass)); 1044 - 1045 - let today = output(Command::new("date").arg("+%Y-%m-%d")); 1046 - 1047 - cmd.arg(sign); 1048 - cmd.arg(distdir(build)); 1049 - 
cmd.arg(today.trim()); 1050 - cmd.arg(build.rust_package_vers()); 1051 - cmd.arg(build.package_vers(&build.release_num("cargo"))); 1052 - cmd.arg(build.package_vers(&build.release_num("rls"))); 1053 - cmd.arg(addr); 1054 - 1055 - t!(fs::create_dir_all(distdir(build))); 1056 - 1057 - let mut child = t!(cmd.stdin(Stdio::piped()).spawn()); 1058 - t!(child.stdin.take().unwrap().write_all(pass.as_bytes())); 1059 - let status = t!(child.wait()); 1060 - assert!(status.success()); 1061 -}
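The `sanitize_sh` helper in the file above rewrites Windows paths so the bundled shell scripts can consume them: backslashes become forward slashes, and a drive prefix like `C:\` becomes `/C/`. As a standalone illustration, here is a minimal re-implementation sketch of that rewrite (not the build's own module, and independent of the `Build` context):

```rust
// Sketch of the `sanitize_sh` path rewrite: `\` -> `/`, and a leading
// drive prefix such as `C:/` is turned into `/C/` for shell consumption.
fn sanitize_sh(path: &str) -> String {
    // First normalize all backslashes to forward slashes.
    let path = path.replace('\\', "/");
    // Then rewrite the drive prefix, if one is present.
    change_drive(&path).unwrap_or(path)
}

fn change_drive(s: &str) -> Option<String> {
    let mut ch = s.chars();
    // A drive prefix is a single letter followed by `:` and `/`.
    let drive = ch.next()?;
    if ch.next() != Some(':') || ch.next() != Some('/') {
        return None;
    }
    Some(format!("/{}/{}", drive, &s[drive.len_utf8() + 2..]))
}

fn main() {
    assert_eq!(sanitize_sh(r"C:\rust\bin"), "/C/rust/bin");
    // Paths without a drive prefix pass through with slashes normalized.
    assert_eq!(sanitize_sh("relative/path"), "relative/path");
}
```

Note that the original returns the slash-normalized path unchanged when no drive prefix is found, which is why `change_drive` yields an `Option` rather than always rewriting.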
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/doc.rs version [fcdfc4a443].
1 -// Copyright 2016 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Documentation generation for rustbuild. 12 -//! 13 -//! This module implements generation for all bits and pieces of documentation 14 -//! for the Rust project. This notably includes suites like the rust book, the 15 -//! nomicon, standalone documentation, etc. 16 -//! 17 -//! Everything here is basically just a shim around calling either `rustbook` or 18 -//! `rustdoc`. 19 - 20 -use std::fs::{self, File}; 21 -use std::io::prelude::*; 22 -use std::io; 23 -use std::path::Path; 24 -use std::process::Command; 25 - 26 -use {Build, Compiler, Mode}; 27 -use util::{cp_r, symlink_dir}; 28 -use build_helper::up_to_date; 29 - 30 -/// Invoke `rustbook` as compiled in `stage` for `target` for the doc book 31 -/// `name` into the `out` path. 32 -/// 33 -/// This will not actually generate any documentation if the documentation has 34 -/// already been generated. 
35 -pub fn rustbook(build: &Build, target: &str, name: &str) { 36 - let out = build.doc_out(target); 37 - t!(fs::create_dir_all(&out)); 38 - 39 - let out = out.join(name); 40 - let compiler = Compiler::new(0, &build.config.build); 41 - let src = build.src.join("src/doc").join(name); 42 - let index = out.join("index.html"); 43 - let rustbook = build.tool(&compiler, "rustbook"); 44 - if up_to_date(&src, &index) && up_to_date(&rustbook, &index) { 45 - return 46 - } 47 - println!("Rustbook ({}) - {}", target, name); 48 - let _ = fs::remove_dir_all(&out); 49 - build.run(build.tool_cmd(&compiler, "rustbook") 50 - .arg("build") 51 - .arg(&src) 52 - .arg("-d") 53 - .arg(out)); 54 -} 55 - 56 -/// Build the book and associated stuff. 57 -/// 58 -/// We need to build: 59 -/// 60 -/// * Book (first edition) 61 -/// * Book (second edition) 62 -/// * Index page 63 -/// * Redirect pages 64 -pub fn book(build: &Build, target: &str, name: &str) { 65 - // build book first edition 66 - rustbook(build, target, &format!("{}/first-edition", name)); 67 - 68 - // build book second edition 69 - rustbook(build, target, &format!("{}/second-edition", name)); 70 - 71 - // build the index page 72 - let index = format!("{}/index.md", name); 73 - println!("Documenting book index ({})", target); 74 - invoke_rustdoc(build, target, &index); 75 - 76 - // build the redirect pages 77 - println!("Documenting book redirect pages ({})", target); 78 - for file in t!(fs::read_dir(build.src.join("src/doc/book/redirects"))) { 79 - let file = t!(file); 80 - let path = file.path(); 81 - let path = path.to_str().unwrap(); 82 - 83 - invoke_rustdoc(build, target, path); 84 - } 85 -} 86 - 87 -fn invoke_rustdoc(build: &Build, target: &str, markdown: &str) { 88 - let out = build.doc_out(target); 89 - 90 - let compiler = Compiler::new(0, &build.config.build); 91 - 92 - let path = build.src.join("src/doc").join(markdown); 93 - 94 - let rustdoc = build.rustdoc(&compiler); 95 - 96 - let favicon = 
build.src.join("src/doc/favicon.inc"); 97 - let footer = build.src.join("src/doc/footer.inc"); 98 - 99 - let version_input = build.src.join("src/doc/version_info.html.template"); 100 - let version_info = out.join("version_info.html"); 101 - 102 - if !up_to_date(&version_input, &version_info) { 103 - let mut info = String::new(); 104 - t!(t!(File::open(&version_input)).read_to_string(&mut info)); 105 - let info = info.replace("VERSION", &build.rust_release()) 106 - .replace("SHORT_HASH", build.rust_info.sha_short().unwrap_or("")) 107 - .replace("STAMP", build.rust_info.sha().unwrap_or("")); 108 - t!(t!(File::create(&version_info)).write_all(info.as_bytes())); 109 - } 110 - 111 - let mut cmd = Command::new(&rustdoc); 112 - 113 - build.add_rustc_lib_path(&compiler, &mut cmd); 114 - 115 - let out = out.join("book"); 116 - 117 - t!(fs::copy(build.src.join("src/doc/rust.css"), out.join("rust.css"))); 118 - 119 - cmd.arg("--html-after-content").arg(&footer) 120 - .arg("--html-before-content").arg(&version_info) 121 - .arg("--html-in-header").arg(&favicon) 122 - .arg("--markdown-playground-url") 123 - .arg("https://play.rust-lang.org/") 124 - .arg("-o").arg(&out) 125 - .arg(&path) 126 - .arg("--markdown-css") 127 - .arg("rust.css"); 128 - 129 - build.run(&mut cmd); 130 -} 131 - 132 -/// Generates all standalone documentation as compiled by the rustdoc in `stage` 133 -/// for the `target` into `out`. 134 -/// 135 -/// This will list all of `src/doc` looking for markdown files and appropriately 136 -/// perform transformations like substituting `VERSION`, `SHORT_HASH`, and 137 -/// `STAMP` along with providing the various header/footer HTML we've customized. 138 -/// 139 -/// In the end, this is just a glorified wrapper around rustdoc!
140 -pub fn standalone(build: &Build, target: &str) { 141 - println!("Documenting standalone ({})", target); 142 - let out = build.doc_out(target); 143 - t!(fs::create_dir_all(&out)); 144 - 145 - let compiler = Compiler::new(0, &build.config.build); 146 - 147 - let favicon = build.src.join("src/doc/favicon.inc"); 148 - let footer = build.src.join("src/doc/footer.inc"); 149 - let full_toc = build.src.join("src/doc/full-toc.inc"); 150 - t!(fs::copy(build.src.join("src/doc/rust.css"), out.join("rust.css"))); 151 - 152 - let version_input = build.src.join("src/doc/version_info.html.template"); 153 - let version_info = out.join("version_info.html"); 154 - 155 - if !up_to_date(&version_input, &version_info) { 156 - let mut info = String::new(); 157 - t!(t!(File::open(&version_input)).read_to_string(&mut info)); 158 - let info = info.replace("VERSION", &build.rust_release()) 159 - .replace("SHORT_HASH", build.rust_info.sha_short().unwrap_or("")) 160 - .replace("STAMP", build.rust_info.sha().unwrap_or("")); 161 - t!(t!(File::create(&version_info)).write_all(info.as_bytes())); 162 - } 163 - 164 - for file in t!(fs::read_dir(build.src.join("src/doc"))) { 165 - let file = t!(file); 166 - let path = file.path(); 167 - let filename = path.file_name().unwrap().to_str().unwrap(); 168 - if !filename.ends_with(".md") || filename == "README.md" { 169 - continue 170 - } 171 - 172 - let html = out.join(filename).with_extension("html"); 173 - let rustdoc = build.rustdoc(&compiler); 174 - if up_to_date(&path, &html) && 175 - up_to_date(&footer, &html) && 176 - up_to_date(&favicon, &html) && 177 - up_to_date(&full_toc, &html) && 178 - up_to_date(&version_info, &html) && 179 - up_to_date(&rustdoc, &html) { 180 - continue 181 - } 182 - 183 - let mut cmd = Command::new(&rustdoc); 184 - build.add_rustc_lib_path(&compiler, &mut cmd); 185 - cmd.arg("--html-after-content").arg(&footer) 186 - .arg("--html-before-content").arg(&version_info) 187 - .arg("--html-in-header").arg(&favicon) 188 - 
.arg("--markdown-playground-url") 189 - .arg("https://play.rust-lang.org/") 190 - .arg("-o").arg(&out) 191 - .arg(&path); 192 - 193 - if filename == "not_found.md" { 194 - cmd.arg("--markdown-no-toc") 195 - .arg("--markdown-css") 196 - .arg("https://doc.rust-lang.org/rust.css"); 197 - } else { 198 - cmd.arg("--markdown-css").arg("rust.css"); 199 - } 200 - build.run(&mut cmd); 201 - } 202 -} 203 - 204 -/// Compile all standard library documentation. 205 -/// 206 -/// This will generate all documentation for the standard library and its 207 -/// dependencies. This is largely just a wrapper around `cargo doc`. 208 -pub fn std(build: &Build, stage: u32, target: &str) { 209 - println!("Documenting stage{} std ({})", stage, target); 210 - let out = build.doc_out(target); 211 - t!(fs::create_dir_all(&out)); 212 - let compiler = Compiler::new(stage, &build.config.build); 213 - let compiler = if build.force_use_stage1(&compiler, target) { 214 - Compiler::new(1, compiler.host) 215 - } else { 216 - compiler 217 - }; 218 - let out_dir = build.stage_out(&compiler, Mode::Libstd) 219 - .join(target).join("doc"); 220 - let rustdoc = build.rustdoc(&compiler); 221 - 222 - // Here what we're doing is creating a *symlink* (directory junction on 223 - // Windows) to the final output location. This is not done as an 224 - // optimization but rather for correctness. We've got three trees of 225 - // documentation, one for std, one for test, and one for rustc. It's then 226 - // our job to merge them all together. 227 - // 228 - // Unfortunately rustbuild doesn't know nearly as well how to merge doc 229 - // trees as rustdoc does itself, so instead of actually having three 230 - // separate trees we just have rustdoc output to the same location across 231 - // all of them. 232 - // 233 - // This way rustdoc generates output directly into the output, and rustdoc 234 - // will also directly handle merging. 
235 - let my_out = build.crate_doc_out(target); 236 - build.clear_if_dirty(&my_out, &rustdoc); 237 - t!(symlink_dir_force(&my_out, &out_dir)); 238 - 239 - let mut cargo = build.cargo(&compiler, Mode::Libstd, target, "doc"); 240 - cargo.arg("--manifest-path") 241 - .arg(build.src.join("src/libstd/Cargo.toml")) 242 - .arg("--features").arg(build.std_features()); 243 - 244 - // We don't want to build docs for internal std dependencies unless 245 - // in compiler-docs mode. When not in that mode, we whitelist the crates 246 - // for which docs must be built. 247 - if !build.config.compiler_docs { 248 - cargo.arg("--no-deps"); 249 - for krate in &["alloc", "collections", "core", "std", "std_unicode"] { 250 - cargo.arg("-p").arg(krate); 251 - // Create all crate output directories first to make sure rustdoc uses 252 - // relative links. 253 - // FIXME: Cargo should probably do this itself. 254 - t!(fs::create_dir_all(out_dir.join(krate))); 255 - } 256 - } 257 - 258 - 259 - build.run(&mut cargo); 260 - cp_r(&my_out, &out); 261 -} 262 - 263 -/// Compile all libtest documentation. 264 -/// 265 -/// This will generate all documentation for libtest and its dependencies. This 266 -/// is largely just a wrapper around `cargo doc`. 
267 -pub fn test(build: &Build, stage: u32, target: &str) { 268 - println!("Documenting stage{} test ({})", stage, target); 269 - let out = build.doc_out(target); 270 - t!(fs::create_dir_all(&out)); 271 - let compiler = Compiler::new(stage, &build.config.build); 272 - let compiler = if build.force_use_stage1(&compiler, target) { 273 - Compiler::new(1, compiler.host) 274 - } else { 275 - compiler 276 - }; 277 - let out_dir = build.stage_out(&compiler, Mode::Libtest) 278 - .join(target).join("doc"); 279 - let rustdoc = build.rustdoc(&compiler); 280 - 281 - // See docs in std above for why we symlink 282 - let my_out = build.crate_doc_out(target); 283 - build.clear_if_dirty(&my_out, &rustdoc); 284 - t!(symlink_dir_force(&my_out, &out_dir)); 285 - 286 - let mut cargo = build.cargo(&compiler, Mode::Libtest, target, "doc"); 287 - cargo.arg("--manifest-path") 288 - .arg(build.src.join("src/libtest/Cargo.toml")); 289 - build.run(&mut cargo); 290 - cp_r(&my_out, &out); 291 -} 292 - 293 -/// Generate all compiler documentation. 294 -/// 295 -/// This will generate all documentation for the compiler libraries and their 296 -/// dependencies. This is largely just a wrapper around `cargo doc`. 
297 -pub fn rustc(build: &Build, stage: u32, target: &str) { 298 - println!("Documenting stage{} compiler ({})", stage, target); 299 - let out = build.doc_out(target); 300 - t!(fs::create_dir_all(&out)); 301 - let compiler = Compiler::new(stage, &build.config.build); 302 - let compiler = if build.force_use_stage1(&compiler, target) { 303 - Compiler::new(1, compiler.host) 304 - } else { 305 - compiler 306 - }; 307 - let out_dir = build.stage_out(&compiler, Mode::Librustc) 308 - .join(target).join("doc"); 309 - let rustdoc = build.rustdoc(&compiler); 310 - 311 - // See docs in std above for why we symlink 312 - let my_out = build.crate_doc_out(target); 313 - build.clear_if_dirty(&my_out, &rustdoc); 314 - t!(symlink_dir_force(&my_out, &out_dir)); 315 - 316 - let mut cargo = build.cargo(&compiler, Mode::Librustc, target, "doc"); 317 - cargo.arg("--manifest-path") 318 - .arg(build.src.join("src/rustc/Cargo.toml")) 319 - .arg("--features").arg(build.rustc_features()); 320 - 321 - if build.config.compiler_docs { 322 - // src/rustc/Cargo.toml contains bin crates called rustc and rustdoc 323 - // which would otherwise overwrite the docs for the real rustc and 324 - // rustdoc lib crates. 325 - cargo.arg("-p").arg("rustc_driver") 326 - .arg("-p").arg("rustdoc"); 327 - } else { 328 - // Like with libstd above if compiler docs aren't enabled then we're not 329 - // documenting internal dependencies, so we have a whitelist. 330 - cargo.arg("--no-deps"); 331 - for krate in &["proc_macro"] { 332 - cargo.arg("-p").arg(krate); 333 - } 334 - } 335 - 336 - build.run(&mut cargo); 337 - cp_r(&my_out, &out); 338 -} 339 - 340 -/// Generates the HTML rendered error-index by running the 341 -/// `error_index_generator` tool. 
342 -pub fn error_index(build: &Build, target: &str) { 343 - println!("Documenting error index ({})", target); 344 - let out = build.doc_out(target); 345 - t!(fs::create_dir_all(&out)); 346 - let compiler = Compiler::new(0, &build.config.build); 347 - let mut index = build.tool_cmd(&compiler, "error_index_generator"); 348 - index.arg("html"); 349 - index.arg(out.join("error-index.html")); 350 - 351 - // FIXME: shouldn't have to pass this env var 352 - index.env("CFG_BUILD", &build.config.build); 353 - 354 - build.run(&mut index); 355 -} 356 - 357 -fn symlink_dir_force(src: &Path, dst: &Path) -> io::Result<()> { 358 - if let Ok(m) = fs::symlink_metadata(dst) { 359 - if m.file_type().is_dir() { 360 - try!(fs::remove_dir_all(dst)); 361 - } else { 362 - // handle directory junctions on windows by falling back to 363 - // `remove_dir`. 364 - try!(fs::remove_file(dst).or_else(|_| { 365 - fs::remove_dir(dst) 366 - })); 367 - } 368 - } 369 - 370 - symlink_dir(src, dst) 371 -}
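The `up_to_date` guard used throughout this file (imported from `build_helper`) compares modification times so work is skipped when an output is already newer than its inputs. A minimal stdlib-only sketch of such a check — the name and the missing-output-means-stale semantics are assumed from how the file uses it, not copied from `build_helper`:

```rust
use std::fs;
use std::path::Path;

// Sketch of a freshness check in the spirit of build_helper's `up_to_date`
// (exact upstream semantics assumed): the target is fresh only if it
// exists and is at least as new as its source.
fn up_to_date(src: &Path, dst: &Path) -> bool {
    // A missing or unreadable output always forces a rebuild.
    let dst_time = match fs::metadata(dst).and_then(|m| m.modified()) {
        Ok(t) => t,
        Err(_) => return false,
    };
    // If the source itself is unreadable, conservatively report stale.
    match fs::metadata(src).and_then(|m| m.modified()) {
        Ok(src_time) => src_time <= dst_time,
        Err(_) => false,
    }
}
```

Note the real helper is also compared against the `rustbook`/`rustdoc` binaries themselves, so a rebuilt tool invalidates otherwise-fresh output.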
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/flags.rs version [a9988039d3].
1 -// Copyright 2015 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Command-line interface of the rustbuild build system. 12 -//! 13 -//! This module implements the command-line parsing of the build system which 14 -//! has various flags to configure how it's run. 15 - 16 -use std::env; 17 -use std::fs; 18 -use std::path::PathBuf; 19 -use std::process; 20 - 21 -use getopts::Options; 22 - 23 -use Build; 24 -use config::Config; 25 -use metadata; 26 -use step; 27 - 28 -/// Deserialized version of all flags for this compile. 29 -pub struct Flags { 30 - pub verbose: usize, // verbosity level: 0 == not verbose, 1 == verbose, 2 == very verbose 31 - pub on_fail: Option<String>, 32 - pub stage: Option<u32>, 33 - pub keep_stage: Option<u32>, 34 - pub build: String, 35 - pub host: Vec<String>, 36 - pub target: Vec<String>, 37 - pub config: Option<PathBuf>, 38 - pub src: Option<PathBuf>, 39 - pub jobs: Option<u32>, 40 - pub cmd: Subcommand, 41 - pub incremental: bool, 42 -} 43 - 44 -impl Flags { 45 - pub fn verbose(&self) -> bool { 46 - self.verbose > 0 47 - } 48 - 49 - pub fn very_verbose(&self) -> bool { 50 - self.verbose > 1 51 - } 52 -} 53 - 54 -pub enum Subcommand { 55 - Build { 56 - paths: Vec<PathBuf>, 57 - }, 58 - Doc { 59 - paths: Vec<PathBuf>, 60 - }, 61 - Test { 62 - paths: Vec<PathBuf>, 63 - test_args: Vec<String>, 64 - }, 65 - Bench { 66 - paths: Vec<PathBuf>, 67 - test_args: Vec<String>, 68 - }, 69 - Clean, 70 - Dist { 71 - paths: Vec<PathBuf>, 72 - install: bool, 73 - }, 74 -} 75 - 76 -impl Flags { 77 - pub fn parse(args: &[String]) 
-> Flags { 78 - let mut extra_help = String::new(); 79 - let mut subcommand_help = format!("\ 80 -Usage: x.py <subcommand> [options] [<paths>...] 81 - 82 -Subcommands: 83 - build Compile either the compiler or libraries 84 - test Build and run some test suites 85 - bench Build and run some benchmarks 86 - doc Build documentation 87 - clean Clean out build directories 88 - dist Build and/or install distribution artifacts 89 - 90 -To learn more about a subcommand, run `./x.py <subcommand> -h`"); 91 - 92 - let mut opts = Options::new(); 93 - // Options common to all subcommands 94 - opts.optflagmulti("v", "verbose", "use verbose output (-vv for very verbose)"); 95 - opts.optflag("i", "incremental", "use incremental compilation"); 96 - opts.optopt("", "config", "TOML configuration file for build", "FILE"); 97 - opts.optopt("", "build", "build target of the stage0 compiler", "BUILD"); 98 - opts.optmulti("", "host", "host targets to build", "HOST"); 99 - opts.optmulti("", "target", "target targets to build", "TARGET"); 100 - opts.optopt("", "on-fail", "command to run on failure", "CMD"); 101 - opts.optopt("", "stage", "stage to build", "N"); 102 - opts.optopt("", "keep-stage", "stage to keep without recompiling", "N"); 103 - opts.optopt("", "src", "path to the root of the rust checkout", "DIR"); 104 - opts.optopt("j", "jobs", "number of jobs to run in parallel", "JOBS"); 105 - opts.optflag("h", "help", "print this help message"); 106 - 107 - // fn usage() 108 - let usage = |exit_code: i32, opts: &Options, subcommand_help: &str, extra_help: &str| -> ! { 109 - println!("{}", opts.usage(subcommand_help)); 110 - if !extra_help.is_empty() { 111 - println!("{}", extra_help); 112 - } 113 - process::exit(exit_code); 114 - }; 115 - 116 - // We can't use getopt to parse the options until we have completed specifying which 117 - // options are valid, but under the current implementation, some options are conditional on 118 - // the subcommand. 
Therefore we must manually identify the subcommand first, so that we can 119 - // complete the definition of the options. Then we can use the getopt::Matches object from 120 - // there on out. 121 - let mut possible_subcommands = args.iter().collect::<Vec<_>>(); 122 - possible_subcommands.retain(|&s| 123 - (s == "build") 124 - || (s == "test") 125 - || (s == "bench") 126 - || (s == "doc") 127 - || (s == "clean") 128 - || (s == "dist")); 129 - let subcommand = match possible_subcommands.first() { 130 - Some(s) => s, 131 - None => { 132 - // No subcommand -- show the general usage and subcommand help 133 - println!("{}\n", subcommand_help); 134 - process::exit(0); 135 - } 136 - }; 137 - 138 - // Some subcommands get extra options 139 - match subcommand.as_str() { 140 - "test" => { opts.optmulti("", "test-args", "extra arguments", "ARGS"); }, 141 - "bench" => { opts.optmulti("", "test-args", "extra arguments", "ARGS"); }, 142 - "dist" => { opts.optflag("", "install", "run installer as well"); }, 143 - _ => { }, 144 - }; 145 - 146 - // Done specifying what options are possible, so do the getopts parsing 147 - let matches = opts.parse(&args[..]).unwrap_or_else(|e| { 148 - // Invalid argument/option format 149 - println!("\n{}\n", e); 150 - usage(1, &opts, &subcommand_help, &extra_help); 151 - }); 152 - // Extra sanity check to make sure we didn't hit this crazy corner case: 153 - // 154 - // ./x.py --frobulate clean build 155 - // ^-- option ^ ^- actual subcommand 156 - // \_ arg to option could be mistaken as subcommand 157 - let mut pass_sanity_check = true; 158 - match matches.free.get(0) { 159 - Some(check_subcommand) => { 160 - if &check_subcommand != subcommand { 161 - pass_sanity_check = false; 162 - } 163 - }, 164 - None => { 165 - pass_sanity_check = false; 166 - } 167 - } 168 - if !pass_sanity_check { 169 - println!("{}\n", subcommand_help); 170 - println!("Sorry, I couldn't figure out which subcommand you were trying to specify.\n\ 171 - You may need to move 
some options to after the subcommand.\n"); 172 - process::exit(1); 173 - } 174 - // Extra help text for some commands 175 - match subcommand.as_str() { 176 - "build" => { 177 - subcommand_help.push_str("\n 178 -Arguments: 179 - This subcommand accepts a number of paths to directories to the crates 180 - and/or artifacts to compile. For example: 181 - 182 - ./x.py build src/libcore 183 - ./x.py build src/libcore src/libproc_macro 184 - ./x.py build src/libstd --stage 1 185 - 186 - If no arguments are passed then the complete artifacts for that stage are 187 - also compiled. 188 - 189 - ./x.py build 190 - ./x.py build --stage 1 191 - 192 - For a quick build with a usable compile, you can pass: 193 - 194 - ./x.py build --stage 1 src/libtest"); 195 - } 196 - "test" => { 197 - subcommand_help.push_str("\n 198 -Arguments: 199 - This subcommand accepts a number of paths to directories to tests that 200 - should be compiled and run. For example: 201 - 202 - ./x.py test src/test/run-pass 203 - ./x.py test src/libstd --test-args hash_map 204 - ./x.py test src/libstd --stage 0 205 - 206 - If no arguments are passed then the complete artifacts for that stage are 207 - compiled and tested. 208 - 209 - ./x.py test 210 - ./x.py test --stage 1"); 211 - } 212 - "doc" => { 213 - subcommand_help.push_str("\n 214 -Arguments: 215 - This subcommand accepts a number of paths to directories of documentation 216 - to build. 
For example: 217 - 218 - ./x.py doc src/doc/book 219 - ./x.py doc src/doc/nomicon 220 - ./x.py doc src/doc/book src/libstd 221 - 222 - If no arguments are passed then everything is documented: 223 - 224 - ./x.py doc 225 - ./x.py doc --stage 1"); 226 - } 227 - _ => { } 228 - }; 229 - // Get any optional paths which occur after the subcommand 230 - let cwd = t!(env::current_dir()); 231 - let paths = matches.free[1..].iter().map(|p| cwd.join(p)).collect::<Vec<_>>(); 232 - 233 - 234 - // All subcommands can have an optional "Available paths" section 235 - if matches.opt_present("verbose") { 236 - let flags = Flags::parse(&["build".to_string()]); 237 - let mut config = Config::default(); 238 - config.build = flags.build.clone(); 239 - let mut build = Build::new(flags, config); 240 - metadata::build(&mut build); 241 - let maybe_rules_help = step::build_rules(&build).get_help(subcommand); 242 - if maybe_rules_help.is_some() { 243 - extra_help.push_str(maybe_rules_help.unwrap().as_str()); 244 - } 245 - } else { 246 - extra_help.push_str(format!("Run `./x.py {} -h -v` to see a list of available paths.", 247 - subcommand).as_str()); 248 - } 249 - 250 - // User passed in -h/--help? 
251 - if matches.opt_present("help") { 252 - usage(0, &opts, &subcommand_help, &extra_help); 253 - } 254 - 255 - let cmd = match subcommand.as_str() { 256 - "build" => { 257 - Subcommand::Build { paths: paths } 258 - } 259 - "test" => { 260 - Subcommand::Test { 261 - paths: paths, 262 - test_args: matches.opt_strs("test-args"), 263 - } 264 - } 265 - "bench" => { 266 - Subcommand::Bench { 267 - paths: paths, 268 - test_args: matches.opt_strs("test-args"), 269 - } 270 - } 271 - "doc" => { 272 - Subcommand::Doc { paths: paths } 273 - } 274 - "clean" => { 275 - if paths.len() > 0 { 276 - println!("\nclean takes no arguments\n"); 277 - usage(1, &opts, &subcommand_help, &extra_help); 278 - } 279 - Subcommand::Clean 280 - } 281 - "dist" => { 282 - Subcommand::Dist { 283 - paths: paths, 284 - install: matches.opt_present("install"), 285 - } 286 - } 287 - _ => { 288 - usage(1, &opts, &subcommand_help, &extra_help); 289 - } 290 - }; 291 - 292 - 293 - let cfg_file = matches.opt_str("config").map(PathBuf::from).or_else(|| { 294 - if fs::metadata("config.toml").is_ok() { 295 - Some(PathBuf::from("config.toml")) 296 - } else { 297 - None 298 - } 299 - }); 300 - 301 - let mut stage = matches.opt_str("stage").map(|j| j.parse().unwrap()); 302 - 303 - if matches.opt_present("incremental") { 304 - if stage.is_none() { 305 - stage = Some(1); 306 - } 307 - } 308 - 309 - Flags { 310 - verbose: matches.opt_count("verbose"), 311 - stage: stage, 312 - on_fail: matches.opt_str("on-fail"), 313 - keep_stage: matches.opt_str("keep-stage").map(|j| j.parse().unwrap()), 314 - build: matches.opt_str("build").unwrap_or_else(|| { 315 - env::var("BUILD").unwrap() 316 - }), 317 - host: split(matches.opt_strs("host")), 318 - target: split(matches.opt_strs("target")), 319 - config: cfg_file, 320 - src: matches.opt_str("src").map(PathBuf::from), 321 - jobs: matches.opt_str("jobs").map(|j| j.parse().unwrap()), 322 - cmd: cmd, 323 - incremental: matches.opt_present("incremental"), 324 - } 325 - } 326 -} 
327 - 328 -impl Subcommand { 329 - pub fn test_args(&self) -> Vec<&str> { 330 - match *self { 331 - Subcommand::Test { ref test_args, .. } | 332 - Subcommand::Bench { ref test_args, .. } => { 333 - test_args.iter().flat_map(|s| s.split_whitespace()).collect() 334 - } 335 - _ => Vec::new(), 336 - } 337 - } 338 -} 339 - 340 -fn split(s: Vec<String>) -> Vec<String> { 341 - s.iter().flat_map(|s| s.split(',')).map(|s| s.to_string()).collect() 342 -}
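The two-pass parse in `Flags::parse` above first scans the raw argument list for a known subcommand, because the full getopts option set cannot be defined until the subcommand is known. That first pass can be sketched crate-free (function name here is illustrative, not from the file):

```rust
// The recognised subcommands, mirroring the retain() filter in Flags::parse.
const SUBCOMMANDS: &[&str] = &["build", "test", "bench", "doc", "clean", "dist"];

// Return the first argument naming a known subcommand, if any. Like the
// original scan, this can be fooled by an option *value* that happens to
// match a subcommand name -- which is exactly why Flags::parse re-checks
// the parsed matches afterwards (its "sanity check").
fn find_subcommand(args: &[String]) -> Option<&str> {
    args.iter().map(|s| s.as_str()).find(|s| SUBCOMMANDS.contains(s))
}
```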
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/install.rs version [7f9a2c0617].
1 -// Copyright 2016 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Implementation of the install aspects of the compiler. 12 -//! 13 -//! This module is responsible for installing the standard library, 14 -//! compiler, and documentation. 15 - 16 -use std::env; 17 -use std::fs; 18 -use std::path::{Path, PathBuf, Component}; 19 -use std::process::Command; 20 - 21 -use Build; 22 -use dist::{sanitize_sh, tmpdir}; 23 - 24 -/// Installs everything. 25 -pub fn install(build: &Build, stage: u32, host: &str) { 26 - let prefix_default = PathBuf::from("/usr/local"); 27 - let docdir_default = PathBuf::from("share/doc/rust"); 28 - let mandir_default = PathBuf::from("share/man"); 29 - let libdir_default = PathBuf::from("lib"); 30 - let prefix = build.config.prefix.as_ref().unwrap_or(&prefix_default); 31 - let docdir = build.config.docdir.as_ref().unwrap_or(&docdir_default); 32 - let libdir = build.config.libdir.as_ref().unwrap_or(&libdir_default); 33 - let mandir = build.config.mandir.as_ref().unwrap_or(&mandir_default); 34 - 35 - let docdir = prefix.join(docdir); 36 - let libdir = prefix.join(libdir); 37 - let mandir = prefix.join(mandir); 38 - 39 - let destdir = env::var_os("DESTDIR").map(PathBuf::from); 40 - 41 - let prefix = add_destdir(&prefix, &destdir); 42 - let docdir = add_destdir(&docdir, &destdir); 43 - let libdir = add_destdir(&libdir, &destdir); 44 - let mandir = add_destdir(&mandir, &destdir); 45 - 46 - let empty_dir = build.out.join("tmp/empty_dir"); 47 - t!(fs::create_dir_all(&empty_dir)); 48 - if build.config.docs { 49 - 
install_sh(&build, "docs", "rust-docs", &build.rust_package_vers(), 50 - stage, host, &prefix, &docdir, &libdir, &mandir, &empty_dir); 51 - } 52 - 53 - for target in build.config.target.iter() { 54 - install_sh(&build, "std", "rust-std", &build.rust_package_vers(), 55 - stage, target, &prefix, &docdir, &libdir, &mandir, &empty_dir); 56 - } 57 - 58 - if build.config.extended { 59 - install_sh(&build, "cargo", "cargo", &build.cargo_package_vers(), 60 - stage, host, &prefix, &docdir, &libdir, &mandir, &empty_dir); 61 - install_sh(&build, "rls", "rls", &build.rls_package_vers(), 62 - stage, host, &prefix, &docdir, &libdir, &mandir, &empty_dir); 63 - } 64 - 65 - install_sh(&build, "rustc", "rustc", &build.rust_package_vers(), 66 - stage, host, &prefix, &docdir, &libdir, &mandir, &empty_dir); 67 - 68 - t!(fs::remove_dir_all(&empty_dir)); 69 -} 70 - 71 -fn install_sh(build: &Build, package: &str, name: &str, version: &str, stage: u32, host: &str, 72 - prefix: &Path, docdir: &Path, libdir: &Path, mandir: &Path, empty_dir: &Path) { 73 - println!("Install {} stage{} ({})", package, stage, host); 74 - let package_name = format!("{}-{}-{}", name, version, host); 75 - 76 - let mut cmd = Command::new("sh"); 77 - cmd.current_dir(empty_dir) 78 - .arg(sanitize_sh(&tmpdir(build).join(&package_name).join("install.sh"))) 79 - .arg(format!("--prefix={}", sanitize_sh(prefix))) 80 - .arg(format!("--docdir={}", sanitize_sh(docdir))) 81 - .arg(format!("--libdir={}", sanitize_sh(libdir))) 82 - .arg(format!("--mandir={}", sanitize_sh(mandir))) 83 - .arg("--disable-ldconfig"); 84 - build.run(&mut cmd); 85 -} 86 - 87 -fn add_destdir(path: &Path, destdir: &Option<PathBuf>) -> PathBuf { 88 - let mut ret = match *destdir { 89 - Some(ref dest) => dest.clone(), 90 - None => return path.to_path_buf(), 91 - }; 92 - for part in path.components() { 93 - match part { 94 - Component::Normal(s) => ret.push(s), 95 - _ => {} 96 - } 97 - } 98 - return ret 99 -}
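The DESTDIR handling above re-roots every absolute install path under a staging directory by keeping only the path's normal components. A self-contained sketch of that `add_destdir` transformation, same shape as the helper at the bottom of the file:

```rust
use std::path::{Component, Path, PathBuf};

// Re-root `path` under `destdir`, dropping root/prefix components, so
// "/usr/local/lib" staged under "/tmp/stage" becomes
// "/tmp/stage/usr/local/lib"; with no DESTDIR the path is left untouched.
fn add_destdir(path: &Path, destdir: &Option<PathBuf>) -> PathBuf {
    let mut ret = match *destdir {
        Some(ref dest) => dest.clone(),
        None => return path.to_path_buf(),
    };
    for part in path.components() {
        if let Component::Normal(s) = part {
            ret.push(s);
        }
    }
    ret
}
```

Discarding non-`Normal` components is what makes a DESTDIR prefix compose with an absolute `--prefix` instead of being replaced by it.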
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/job.rs version [beec712824].
1 -// Copyright 2015 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Job management on Windows for bootstrapping 12 -//! 13 -//! Most of the time when you're running a build system (e.g. make) you expect 14 -//! Ctrl-C or abnormal termination to actually terminate the entire tree of 15 -//! processes in play, not just the one at the top. This currently works "by 16 -//! default" on Unix platforms because Ctrl-C actually sends a signal to the 17 -//! *process group* rather than the parent process, so everything will get torn 18 -//! down. On Windows, however, this does not happen and Ctrl-C just kills the 19 -//! parent process. 20 -//! 21 -//! To achieve the same semantics on Windows we use Job Objects to ensure that 22 -//! all processes die at the same time. Job objects have a mode of operation 23 -//! where when all handles to the object are closed it causes all child 24 -//! processes associated with the object to be terminated immediately. 25 -//! Conveniently whenever a process in the job object spawns a new process the 26 -//! child will be associated with the job object as well. This means if we add 27 -//! ourselves to the job object we create then everything will get torn down! 28 -//! 29 -//! Unfortunately most of the time the build system is actually called from a 30 -//! python wrapper (which manages things like building the build system) so this 31 -//! all doesn't quite cut it so far. To go the last mile we duplicate the job 32 -//! object handle into our parent process (a python process probably) and then 33 -//!
close our own handle. This means that the only handle to the job object
//! resides in the parent python process, so when python dies the whole build
//! system dies (as one would probably expect!).
//!
//! Note that this module has a #[cfg(windows)] above it as none of this logic
//! is required on Unix.

#![allow(bad_style, dead_code)]

use std::env;
use std::io;
use std::mem;

type HANDLE = *mut u8;
type BOOL = i32;
type DWORD = u32;
type LPHANDLE = *mut HANDLE;
type LPVOID = *mut u8;
type JOBOBJECTINFOCLASS = i32;
type SIZE_T = usize;
type LARGE_INTEGER = i64;
type UINT = u32;
type ULONG_PTR = usize;
type ULONGLONG = u64;

const FALSE: BOOL = 0;
const DUPLICATE_SAME_ACCESS: DWORD = 0x2;
const PROCESS_DUP_HANDLE: DWORD = 0x40;
const JobObjectExtendedLimitInformation: JOBOBJECTINFOCLASS = 9;
const JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE: DWORD = 0x2000;
const SEM_FAILCRITICALERRORS: UINT = 0x0001;
const SEM_NOGPFAULTERRORBOX: UINT = 0x0002;

extern "system" {
    fn CreateJobObjectW(lpJobAttributes: *mut u8, lpName: *const u8) -> HANDLE;
    fn CloseHandle(hObject: HANDLE) -> BOOL;
    fn GetCurrentProcess() -> HANDLE;
    fn OpenProcess(dwDesiredAccess: DWORD,
                   bInheritHandle: BOOL,
                   dwProcessId: DWORD) -> HANDLE;
    fn DuplicateHandle(hSourceProcessHandle: HANDLE,
                       hSourceHandle: HANDLE,
                       hTargetProcessHandle: HANDLE,
                       lpTargetHandle: LPHANDLE,
                       dwDesiredAccess: DWORD,
                       bInheritHandle: BOOL,
                       dwOptions: DWORD) -> BOOL;
    fn AssignProcessToJobObject(hJob: HANDLE, hProcess: HANDLE) -> BOOL;
    fn SetInformationJobObject(hJob: HANDLE,
                               JobObjectInformationClass: JOBOBJECTINFOCLASS,
                               lpJobObjectInformation: LPVOID,
                               cbJobObjectInformationLength: DWORD) -> BOOL;
    fn SetErrorMode(mode: UINT) -> UINT;
}

#[repr(C)]
struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION {
    BasicLimitInformation: JOBOBJECT_BASIC_LIMIT_INFORMATION,
    IoInfo: IO_COUNTERS,
    ProcessMemoryLimit: SIZE_T,
    JobMemoryLimit: SIZE_T,
    PeakProcessMemoryUsed: SIZE_T,
    PeakJobMemoryUsed: SIZE_T,
}

#[repr(C)]
struct IO_COUNTERS {
    ReadOperationCount: ULONGLONG,
    WriteOperationCount: ULONGLONG,
    OtherOperationCount: ULONGLONG,
    ReadTransferCount: ULONGLONG,
    WriteTransferCount: ULONGLONG,
    OtherTransferCount: ULONGLONG,
}

#[repr(C)]
struct JOBOBJECT_BASIC_LIMIT_INFORMATION {
    PerProcessUserTimeLimit: LARGE_INTEGER,
    PerJobUserTimeLimit: LARGE_INTEGER,
    LimitFlags: DWORD,
    MinimumWorkingsetSize: SIZE_T,
    MaximumWorkingsetSize: SIZE_T,
    ActiveProcessLimit: DWORD,
    Affinity: ULONG_PTR,
    PriorityClass: DWORD,
    SchedulingClass: DWORD,
}

pub unsafe fn setup() {
    // Tell Windows to not show any UI on errors (such as not finding a required dll
    // during startup or terminating abnormally). This is important for running tests,
    // since some of them use abnormal termination by design.
    // This mode is inherited by all child processes.
    let mode = SetErrorMode(SEM_NOGPFAULTERRORBOX); // read inherited flags
    SetErrorMode(mode | SEM_FAILCRITICALERRORS | SEM_NOGPFAULTERRORBOX);

    // Create a new job object for us to use
    let job = CreateJobObjectW(0 as *mut _, 0 as *const _);
    assert!(job != 0 as *mut _, "{}", io::Error::last_os_error());

    // Indicate that when all handles to the job object are gone that all
    // processes in the object should be killed. Note that this includes our
    // entire process tree by default because we've added ourselves and our
    // children will reside in the job by default.
    let mut info = mem::zeroed::<JOBOBJECT_EXTENDED_LIMIT_INFORMATION>();
    info.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE;
    let r = SetInformationJobObject(job,
                                    JobObjectExtendedLimitInformation,
                                    &mut info as *mut _ as LPVOID,
                                    mem::size_of_val(&info) as DWORD);
    assert!(r != 0, "{}", io::Error::last_os_error());

    // Assign our process to this job object. Note that if this fails, one very
    // likely reason is that we are ourselves already in a job object! This can
    // happen on the build bots that we've got for Windows, or if just anyone
    // else is instrumenting the build. In this case we just bail out
    // immediately and assume that they take care of it.
    //
    // Also note that nested jobs (why this might fail) are supported in recent
    // versions of Windows, but the version of Windows that our bots are running
    // at least don't support nested job objects.
    let r = AssignProcessToJobObject(job, GetCurrentProcess());
    if r == 0 {
        CloseHandle(job);
        return
    }

    // If we've got a parent process (e.g. the python script that called us)
    // then move ownership of this job object up to them. That way if the python
    // script is killed (e.g. via ctrl-c) then we'll all be torn down.
    //
    // If we don't have a parent (e.g. this was run directly) then we
    // intentionally leak the job object handle. When our process exits
    // (normally or abnormally) it will close the handle implicitly, causing all
    // processes in the job to be cleaned up.
    let pid = match env::var("BOOTSTRAP_PARENT_ID") {
        Ok(s) => s,
        Err(..) => return,
    };

    let parent = OpenProcess(PROCESS_DUP_HANDLE, FALSE, pid.parse().unwrap());
    assert!(parent != 0 as *mut _, "{}", io::Error::last_os_error());
    let mut parent_handle = 0 as *mut _;
    let r = DuplicateHandle(GetCurrentProcess(), job,
                            parent, &mut parent_handle,
                            0, FALSE, DUPLICATE_SAME_ACCESS);

    // If this failed, well at least we tried! An example of DuplicateHandle
    // failing in the past has been when the wrong python2 package spawned this
    // build system (e.g. the `python2` package in MSYS instead of
    // `mingw-w64-x86_64-python2`). Not sure why it failed, but the "failure
    // mode" here is that we only clean everything up when the build system
    // dies, not when the python parent does, so not too bad.
    if r != 0 {
        CloseHandle(job);
    }
}
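The parent-handoff above hinges on one convention: the parent process exports its own PID in the `BOOTSTRAP_PARENT_ID` environment variable before spawning the build system, and the child parses it back (or bails out, keeping the job handle itself, if the variable is absent). A minimal standalone sketch of both sides of that contract, using only the Rust standard library (the function name `parent_pid` is illustrative, not part of the real module):

```rust
use std::env;

// Child side of the contract in `setup()`: recover the parent PID from
// BOOTSTRAP_PARENT_ID, or None when no parent published one.
fn parent_pid() -> Option<u32> {
    env::var("BOOTSTRAP_PARENT_ID").ok()?.parse().ok()
}

fn main() {
    // Parent side (done by the python script in the real build): publish our
    // own PID before spawning the build system as a child process.
    env::set_var("BOOTSTRAP_PARENT_ID", std::process::id().to_string());

    // Child side: this is the PID that would be handed to OpenProcess.
    let pid = parent_pid().expect("BOOTSTRAP_PARENT_ID not set");
    println!("{}", pid == std::process::id());
}
```

In the real build, of course, the parent and child are separate processes and the variable is inherited through the environment of the spawned command rather than set in-process.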
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/lib.rs version [02c7460c58].
// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! Implementation of rustbuild, the Rust build system.
//!
//! This module, and its descendants, are the implementation of the Rust build
//! system. Most of this build system is backed by Cargo but the outer layer
//! here serves as the ability to orchestrate calling Cargo, sequencing Cargo
//! builds, building artifacts like LLVM, etc. The goals of rustbuild are:
//!
//! * To be an easily understandable, easily extensible, and maintainable build
//!   system.
//! * Leverage standard tools in the Rust ecosystem to build the compiler, aka
//!   crates.io and Cargo.
//! * A standard interface to build across all platforms, including MSVC
//!
//! ## Architecture
//!
//! Although this build system defers most of the complicated logic to Cargo
//! itself, it still needs to maintain a list of targets and dependencies which
//! it can itself perform. Rustbuild is made up of a list of rules with
//! dependencies amongst them (created in the `step` module) and then knows how
//! to execute each in sequence. Each time rustbuild is invoked, it will simply
//! iterate through this list of steps and execute each serially in turn. For
//! each step rustbuild relies on the step internally being incremental and
//! parallel. Note, though, that the `-j` parameter to rustbuild gets forwarded
//! to appropriate test harnesses and such.
//!
//! Most of the "meaty" steps that matter are backed by Cargo, which does indeed
//! have its own parallelism and incremental management. Later steps, like
//! tests, aren't incremental and simply run the entire suite currently.
//!
//! When you execute `x.py build`, the steps which are executed are:
//!
//! * First, the python script is run. This will automatically download the
//!   stage0 rustc and cargo according to `src/stage0.txt`, or using the cached
//!   versions if they're available. These are then used to compile rustbuild
//!   itself (using Cargo). Finally, control is then transferred to rustbuild.
//!
//! * Rustbuild takes over, performs sanity checks, probes the environment,
//!   reads configuration, builds up a list of steps, and then starts executing
//!   them.
//!
//! * The stage0 libstd is compiled
//! * The stage0 libtest is compiled
//! * The stage0 librustc is compiled
//! * The stage1 compiler is assembled
//! * The stage1 libstd, libtest, librustc are compiled
//! * The stage2 compiler is assembled
//! * The stage2 libstd, libtest, librustc are compiled
//!
//! Each step is driven by a separate Cargo project and rustbuild orchestrates
//! copying files between steps and otherwise preparing for Cargo to run.
//!
//! ## Further information
//!
//! More documentation can be found in each respective module below, and you can
//! also check out the `src/bootstrap/README.md` file for more information.
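The module docs describe rustbuild as an ordered list of steps executed serially, walking the stage0 through stage2 pipeline. A hypothetical, much-simplified model of that sequencing, purely to illustrate the ordering the docs describe (the step names below paraphrase the pipeline and are not the real `step` module API):

```rust
// Simplified model: rustbuild holds an ordered list of steps and runs them
// serially. Each stage compiles the three libraries, then the next stage's
// compiler is assembled from them (there is no stage3 assembly here).
fn bootstrap_steps() -> Vec<String> {
    let mut steps = Vec::new();
    for stage in 0..3 {
        for lib in &["libstd", "libtest", "librustc"] {
            steps.push(format!("stage{} {}", stage, lib));
        }
        if stage < 2 {
            steps.push(format!("assemble stage{} compiler", stage + 1));
        }
    }
    steps
}

fn main() {
    for step in bootstrap_steps() {
        println!("{}", step); // executed serially, in order
    }
}
```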
#![deny(warnings)]

#[macro_use]
extern crate build_helper;
extern crate cmake;
extern crate filetime;
extern crate gcc;
extern crate getopts;
extern crate num_cpus;
extern crate rustc_serialize;
extern crate toml;

use std::cmp;
use std::collections::HashMap;
use std::env;
use std::ffi::OsString;
use std::fs::{self, File};
use std::io::Read;
use std::path::{Component, PathBuf, Path};
use std::process::Command;

use build_helper::{run_silent, run_suppressed, output, mtime};

use util::{exe, libdir, add_lib_path};

mod cc;
mod channel;
mod check;
mod clean;
mod compile;
mod metadata;
mod config;
mod dist;
mod doc;
mod flags;
mod install;
mod native;
mod sanity;
mod step;
pub mod util;

#[cfg(windows)]
mod job;

#[cfg(not(windows))]
mod job {
    pub unsafe fn setup() {}
}

pub use config::Config;
pub use flags::{Flags, Subcommand};

/// A structure representing a Rust compiler.
///
/// Each compiler has a `stage` that it is associated with and a `host` that
/// corresponds to the platform the compiler runs on. This structure is used as
/// a parameter to many methods below.
#[derive(Eq, PartialEq, Clone, Copy, Hash, Debug)]
pub struct Compiler<'a> {
    stage: u32,
    host: &'a str,
}

/// Global configuration for the build system.
///
/// This structure transitively contains all configuration for the build system.
/// All filesystem-encoded configuration is in `config`, all flags are in
/// `flags`, and then parsed or probed information is listed in the keys below.
///
/// This structure is a parameter of almost all methods in the build system,
/// although most functions are implemented as free functions rather than
/// methods specifically on this structure itself (to make it easier to
/// organize).
pub struct Build {
    // User-specified configuration via config.toml
    config: Config,

    // User-specified configuration via CLI flags
    flags: Flags,

    // Derived properties from the above two configurations
    cargo: PathBuf,
    rustc: PathBuf,
    src: PathBuf,
    out: PathBuf,
    rust_info: channel::GitInfo,
    cargo_info: channel::GitInfo,
    rls_info: channel::GitInfo,
    local_rebuild: bool,

    // Probed tools at runtime
    lldb_version: Option<String>,
    lldb_python_dir: Option<String>,

    // Runtime state filled in later on
    cc: HashMap<String, (gcc::Tool, Option<PathBuf>)>,
    cxx: HashMap<String, gcc::Tool>,
    crates: HashMap<String, Crate>,
    is_sudo: bool,
    src_is_git: bool,
}

#[derive(Debug)]
struct Crate {
    name: String,
    version: String,
    deps: Vec<String>,
    path: PathBuf,
    doc_step: String,
    build_step: String,
    test_step: String,
    bench_step: String,
}

/// The various "modes" of invoking Cargo.
///
/// These entries currently correspond to the various output directories of the
/// build system, with each mod generating output in a different directory.
#[derive(Clone, Copy, PartialEq, Eq)]
pub enum Mode {
    /// This cargo is going to build the standard library, placing output in
    /// the "stageN-std" directory.
    Libstd,

    /// This cargo is going to build libtest, placing output in the
    /// "stageN-test" directory.
    Libtest,

    /// This cargo is going to build librustc and compiler libraries, placing
    /// output in the "stageN-rustc" directory.
    Librustc,

    /// This cargo is going to build some build tool, placing output in the
    /// "stageN-tools" directory.
    Tool,
}

impl Build {
    /// Creates a new set of build configuration from the `flags` on the command
    /// line and the filesystem `config`.
    ///
    /// By default all build output will be placed in the current directory.
    pub fn new(flags: Flags, config: Config) -> Build {
        let cwd = t!(env::current_dir());
        let src = flags.src.clone().or_else(|| {
            env::var_os("SRC").map(|x| x.into())
        }).unwrap_or(cwd.clone());
        let out = cwd.join("build");

        let stage0_root = out.join(&config.build).join("stage0/bin");
        let rustc = match config.rustc {
            Some(ref s) => PathBuf::from(s),
            None => stage0_root.join(exe("rustc", &config.build)),
        };
        let cargo = match config.cargo {
            Some(ref s) => PathBuf::from(s),
            None => stage0_root.join(exe("cargo", &config.build)),
        };
        let local_rebuild = config.local_rebuild;

        let is_sudo = match env::var_os("SUDO_USER") {
            Some(sudo_user) => {
                match env::var_os("USER") {
                    Some(user) => user != sudo_user,
                    None => false,
                }
            }
            None => false,
        };
        let rust_info = channel::GitInfo::new(&src);
        let cargo_info = channel::GitInfo::new(&src.join("src/tools/cargo"));
        let rls_info = channel::GitInfo::new(&src.join("src/tools/rls"));
        let src_is_git = src.join(".git").exists();

        Build {
            flags: flags,
            config: config,
            cargo: cargo,
            rustc: rustc,
            src: src,
            out: out,

            rust_info: rust_info,
            cargo_info: cargo_info,
            rls_info: rls_info,
            local_rebuild: local_rebuild,
            cc: HashMap::new(),
            cxx: HashMap::new(),
            crates: HashMap::new(),
            lldb_version: None,
            lldb_python_dir: None,
            is_sudo: is_sudo,
            src_is_git: src_is_git,
        }
    }

    /// Executes the entire build, as configured by the flags and configuration.
    pub fn build(&mut self) {
        unsafe {
            job::setup();
        }

        if let Subcommand::Clean = self.flags.cmd {
            return clean::clean(self);
        }

        self.verbose("finding compilers");
        cc::find(self);
        self.verbose("running sanity check");
        sanity::check(self);
        // If local-rust is the same major.minor as the current version, then force a local-rebuild
        let local_version_verbose = output(
            Command::new(&self.rustc).arg("--version").arg("--verbose"));
        let local_release = local_version_verbose
            .lines().filter(|x| x.starts_with("release:"))
            .next().unwrap().trim_left_matches("release:").trim();
        let my_version = channel::CFG_RELEASE_NUM;
        if local_release.split('.').take(2).eq(my_version.split('.').take(2)) {
            self.verbose(&format!("auto-detected local-rebuild {}", local_release));
            self.local_rebuild = true;
        }
        self.verbose("updating submodules");
        self.update_submodules();
        self.verbose("learning about cargo");
        metadata::build(self);

        step::run(self);
    }

    /// Updates all git submodules that we have.
    ///
    /// This will detect if any submodules are out of date and run the necessary
    /// commands to sync them all with upstream.
    fn update_submodules(&self) {
        struct Submodule<'a> {
            path: &'a Path,
            state: State,
        }

        enum State {
            // The submodule may have staged/unstaged changes
            MaybeDirty,
            // Or could be initialized but never updated
            NotInitialized,
            // The submodule, itself, has extra commits but those changes haven't been committed to
            // the (outer) git repository
            OutOfSync,
        }

        if !self.src_is_git || !self.config.submodules {
            return
        }
        let git = || {
            let mut cmd = Command::new("git");
            cmd.current_dir(&self.src);
            return cmd
        };
        let git_submodule = || {
            let mut cmd = Command::new("git");
            cmd.current_dir(&self.src).arg("submodule");
            return cmd
        };

        // FIXME: this takes a seriously long time to execute on Windows and a
        //        nontrivial amount of time on Unix, we should have a better way
        //        of detecting whether we need to run all the submodule commands
        //        below.
        let out = output(git_submodule().arg("status"));
        let mut submodules = vec![];
        for line in out.lines() {
            // NOTE `git submodule status` output looks like this:
            //
            // -5066b7dcab7e700844b0e2ba71b8af9dc627a59b src/liblibc
            // +b37ef24aa82d2be3a3cc0fe89bf82292f4ca181c src/compiler-rt (remotes/origin/..)
            //  e058ca661692a8d01f8cf9d35939dfe3105ce968 src/jemalloc (3.6.0-533-ge058ca6)
            //
            // The first character can be '-', '+' or ' ' and denotes the `State` of the submodule
            // Right next to this character is the SHA-1 of the submodule HEAD
            // And after that comes the path to the submodule
            let path = Path::new(line[1..].split(' ').skip(1).next().unwrap());
            let state = if line.starts_with('-') {
                State::NotInitialized
            } else if line.starts_with('+') {
                State::OutOfSync
            } else if line.starts_with(' ') {
                State::MaybeDirty
            } else {
                panic!("unexpected git submodule state: {:?}", line.chars().next());
            };

            submodules.push(Submodule { path: path, state: state })
        }

        self.run(git_submodule().arg("sync"));

        for submodule in submodules {
            // If using llvm-root then don't touch the llvm submodule.
            if submodule.path.components().any(|c| c == Component::Normal("llvm".as_ref())) &&
                self.config.target_config.get(&self.config.build)
                    .and_then(|c| c.llvm_config.as_ref()).is_some()
            {
                continue
            }

            if submodule.path.components().any(|c| c == Component::Normal("jemalloc".as_ref())) &&
                !self.config.use_jemalloc
            {
                continue
            }

            // `submodule.path` is the relative path to a submodule (from the repository root)
            // `submodule_path` is the path to a submodule from the cwd

            // use `submodule.path` when e.g. executing a submodule specific command from the
            // repository root
            // use `submodule_path` when e.g. executing a normal git command for the submodule
            // (set via `current_dir`)
            let submodule_path = self.src.join(submodule.path);

            match submodule.state {
                State::MaybeDirty => {
                    // drop staged changes
                    self.run(git().current_dir(&submodule_path)
                                  .args(&["reset", "--hard"]));
                    // drops unstaged changes
                    self.run(git().current_dir(&submodule_path)
                                  .args(&["clean", "-fdx"]));
                },
                State::NotInitialized => {
                    self.run(git_submodule().arg("init").arg(submodule.path));
                    self.run(git_submodule().arg("update").arg(submodule.path));
                },
                State::OutOfSync => {
                    // drops submodule commits that weren't reported to the (outer) git repository
                    self.run(git_submodule().arg("update").arg(submodule.path));
                    self.run(git().current_dir(&submodule_path)
                                  .args(&["reset", "--hard"]));
                    self.run(git().current_dir(&submodule_path)
                                  .args(&["clean", "-fdx"]));
                },
            }
        }
    }

    /// Clear out `dir` if `input` is newer.
    ///
    /// After this executes, it will also ensure that `dir` exists.
    fn clear_if_dirty(&self, dir: &Path, input: &Path) {
        let stamp = dir.join(".stamp");
        if mtime(&stamp) < mtime(input) {
            self.verbose(&format!("Dirty - {}", dir.display()));
            let _ = fs::remove_dir_all(dir);
        } else if stamp.exists() {
            return
        }
        t!(fs::create_dir_all(dir));
        t!(File::create(stamp));
    }

    /// Prepares an invocation of `cargo` to be run.
    ///
    /// This will create a `Command` that represents a pending execution of
    /// Cargo. This cargo will be configured to use `compiler` as the actual
    /// rustc compiler, its output will be scoped by `mode`'s output directory,
    /// it will pass the `--target` flag for the specified `target`, and will be
    /// executing the Cargo command `cmd`.
    fn cargo(&self,
             compiler: &Compiler,
             mode: Mode,
             target: &str,
             cmd: &str) -> Command {
        let mut cargo = Command::new(&self.cargo);
        let out_dir = self.stage_out(compiler, mode);
        cargo.env("CARGO_TARGET_DIR", out_dir)
             .arg(cmd)
             .arg("-j").arg(self.jobs().to_string())
             .arg("--target").arg(target);

        // FIXME: Temporary fix for https://github.com/rust-lang/cargo/issues/3005
        // Force cargo to output binaries with disambiguating hashes in the name
        cargo.env("__CARGO_DEFAULT_LIB_METADATA", "1");

        let stage;
        if compiler.stage == 0 && self.local_rebuild {
            // Assume the local-rebuild rustc already has stage1 features.
            stage = 1;
        } else {
            stage = compiler.stage;
        }

        // Customize the compiler we're running. Specify the compiler to cargo
        // as our shim and then pass it some various options used to configure
        // how the actual compiler itself is called.
        //
        // These variables are primarily all read by
        // src/bootstrap/bin/{rustc.rs,rustdoc.rs}
        cargo.env("RUSTBUILD_NATIVE_DIR", self.native_dir(target))
             .env("RUSTC", self.out.join("bootstrap/debug/rustc"))
             .env("RUSTC_REAL", self.compiler_path(compiler))
             .env("RUSTC_STAGE", stage.to_string())
             .env("RUSTC_CODEGEN_UNITS",
                  self.config.rust_codegen_units.to_string())
             .env("RUSTC_DEBUG_ASSERTIONS",
                  self.config.rust_debug_assertions.to_string())
             .env("RUSTC_SYSROOT", self.sysroot(compiler))
             .env("RUSTC_LIBDIR", self.rustc_libdir(compiler))
             .env("RUSTC_RPATH", self.config.rust_rpath.to_string())
             .env("RUSTDOC", self.out.join("bootstrap/debug/rustdoc"))
             .env("RUSTDOC_REAL", self.rustdoc(compiler))
             .env("RUSTC_FLAGS", self.rustc_flags(target).join(" "));

        // Tools don't get debuginfo right now, e.g. cargo and rls don't get
        // compiled with debuginfo.
        if mode != Mode::Tool {
            cargo.env("RUSTC_DEBUGINFO", self.config.rust_debuginfo.to_string())
                 .env("RUSTC_DEBUGINFO_LINES", self.config.rust_debuginfo_lines.to_string())
                 .env("RUSTC_FORCE_UNSTABLE", "1");
        }

        // Enable usage of unstable features
        cargo.env("RUSTC_BOOTSTRAP", "1");
        self.add_rust_test_threads(&mut cargo);

        // Almost all of the crates that we compile as part of the bootstrap may
        // have a build script, including the standard library. To compile a
        // build script, however, it itself needs a standard library! This
        // introduces a bit of a pickle when we're compiling the standard
        // library itself.
        //
        // To work around this we actually end up using the snapshot compiler
        // (stage0) for compiling build scripts of the standard library itself.
        // The stage0 compiler is guaranteed to have a libstd available for use.
        //
        // For other crates, however, we know that we've already got a standard
        // library up and running, so we can use the normal compiler to compile
        // build scripts in that situation.
        if mode == Mode::Libstd {
            cargo.env("RUSTC_SNAPSHOT", &self.rustc)
                 .env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_snapshot_libdir());
        } else {
            cargo.env("RUSTC_SNAPSHOT", self.compiler_path(compiler))
                 .env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_libdir(compiler));
        }

        // There are two invariants we must maintain:
        // * stable crates cannot depend on unstable crates (general Rust rule),
        // * crates that end up in the sysroot must be unstable (rustbuild rule).
        //
        // In order to enforce the latter, we pass the env var
        // `RUSTBUILD_UNSTABLE` down the line for any crates which will end up
        // in the sysroot. We read this in bootstrap/bin/rustc.rs and if it is
        // set, then we pass the `rustbuild` feature to rustc when building
        // the crate.
        //
        // In turn, crates that can be used here should recognise the `rustbuild`
        // feature and opt-in to `rustc_private`.
        //
        // We can't always pass `rustbuild` because crates which are outside of
        // the compiler, libs, and tests are stable and we don't want to make
        // their deps unstable (since this would break the first invariant
        // above).
        //
        // FIXME: remove this after next stage0
        if mode != Mode::Tool && stage == 0 {
            cargo.env("RUSTBUILD_UNSTABLE", "1");
        }

        // Ignore incremental modes except for stage0, since we're
        // not guaranteeing correctness across builds if the compiler
        // is changing under your feet.
        if self.flags.incremental && compiler.stage == 0 {
            let incr_dir = self.incremental_dir(compiler);
            cargo.env("RUSTC_INCREMENTAL", incr_dir);
        }

        if let Some(ref on_fail) = self.flags.on_fail {
            cargo.env("RUSTC_ON_FAIL", on_fail);
        }

        let verbose = cmp::max(self.config.verbose, self.flags.verbose);
        cargo.env("RUSTC_VERBOSE", format!("{}", verbose));

        // Specify some various options for build scripts used throughout
        // the build.
        //
        // FIXME: the guard against msvc shouldn't need to be here
        if !target.contains("msvc") {
            cargo.env(format!("CC_{}", target), self.cc(target))
                 .env(format!("AR_{}", target), self.ar(target).unwrap()) // only msvc is None
                 .env(format!("CFLAGS_{}", target), self.cflags(target).join(" "));
        }

        if self.config.extended && compiler.is_final_stage(self) {
            cargo.env("RUSTC_SAVE_ANALYSIS", "api".to_string());
        }

        // When being built Cargo will at some point call `nmake.exe` on Windows
        // MSVC. Unfortunately `nmake` will read these two environment variables
        // below and try to interpret them. We're likely being run, however, from
        // MSYS `make` which uses the same variables.
        //
        // As a result, to prevent confusion and errors, we remove these
        // variables from our environment to prevent passing MSYS make flags to
        // nmake, causing it to blow up.
        if cfg!(target_env = "msvc") {
            cargo.env_remove("MAKE");
            cargo.env_remove("MAKEFLAGS");
        }

        // Environment variables *required* throughout the build
        //
        // FIXME: should update code to not require this env var
        cargo.env("CFG_COMPILER_HOST_TRIPLE", target);

        if self.config.verbose() || self.flags.verbose() {
            cargo.arg("-v");
        }
        // FIXME: cargo bench does not accept `--release`
        if self.config.rust_optimize && cmd != "bench" {
            cargo.arg("--release");
        }
        if self.config.locked_deps {
            cargo.arg("--locked");
        }
        if self.config.vendor || self.is_sudo {
            cargo.arg("--frozen");
        }
        return cargo
    }

    /// Get a path to the compiler specified.
    fn compiler_path(&self, compiler: &Compiler) -> PathBuf {
        if compiler.is_snapshot(self) {
            self.rustc.clone()
        } else {
            self.sysroot(compiler).join("bin").join(exe("rustc", compiler.host))
        }
    }

    /// Get the specified tool built by the specified compiler
    fn tool(&self, compiler: &Compiler, tool: &str) -> PathBuf {
        self.cargo_out(compiler, Mode::Tool, compiler.host)
            .join(exe(tool, compiler.host))
    }

    /// Get the `rustdoc` executable next to the specified compiler
    fn rustdoc(&self, compiler: &Compiler) -> PathBuf {
        let mut rustdoc = self.compiler_path(compiler);
        rustdoc.pop();
        rustdoc.push(exe("rustdoc", compiler.host));
        return rustdoc
    }

    /// Get a `Command` which is ready to run `tool` in `stage` built for
    /// `host`.
    fn tool_cmd(&self, compiler: &Compiler, tool: &str) -> Command {
        let mut cmd = Command::new(self.tool(&compiler, tool));
        self.prepare_tool_cmd(compiler, &mut cmd);
        return cmd
    }

    /// Prepares the `cmd` provided to be able to run the `compiler` provided.
    ///
    /// Notably this munges the dynamic library lookup path to point to the
    /// right location to run `compiler`.
    fn prepare_tool_cmd(&self, compiler: &Compiler, cmd: &mut Command) {
        let host = compiler.host;
        let mut paths = vec![
            self.sysroot_libdir(compiler, compiler.host),
            self.cargo_out(compiler, Mode::Tool, host).join("deps"),
        ];

        // On MSVC a tool may invoke a C compiler (e.g. compiletest in run-make
        // mode) and that C compiler may need some extra PATH modification. Do
        // so here.
        if compiler.host.contains("msvc") {
            let curpaths = env::var_os("PATH").unwrap_or(OsString::new());
            let curpaths = env::split_paths(&curpaths).collect::<Vec<_>>();
            for &(ref k, ref v) in self.cc[compiler.host].0.env() {
                if k != "PATH" {
                    continue
                }
                for path in env::split_paths(v) {
                    if !curpaths.contains(&path) {
                        paths.push(path);
                    }
                }
            }
        }
        add_lib_path(paths, cmd);
    }

    /// Get the space-separated set of activated features for the standard
    /// library.
    fn std_features(&self) -> String {
        let mut features = "panic-unwind".to_string();

        if self.config.debug_jemalloc {
            features.push_str(" debug-jemalloc");
        }
        if self.config.use_jemalloc {
            features.push_str(" jemalloc");
        }
        if self.config.backtrace {
            features.push_str(" backtrace");
        }
        return features
    }

    /// Get the space-separated set of activated features for the compiler.
    fn rustc_features(&self) -> String {
        let mut features = String::new();
        if self.config.use_jemalloc {
            features.push_str(" jemalloc");
        }
        return features
    }

    /// Component directory that Cargo will produce output into (e.g.
    /// release/debug)
    fn cargo_dir(&self) -> &'static str {
        if self.config.rust_optimize {"release"} else {"debug"}
    }

    /// Returns the sysroot for the `compiler` specified that *this build system
    /// generates*.
    ///
    /// That is, the sysroot for the stage0 compiler is not what the compiler
    /// thinks it is by default, but it's the same as the default for stages
    /// 1-3.
    fn sysroot(&self, compiler: &Compiler) -> PathBuf {
        if compiler.stage == 0 {
            self.out.join(compiler.host).join("stage0-sysroot")
        } else {
            self.out.join(compiler.host).join(format!("stage{}", compiler.stage))
        }
    }

    /// Get the directory for incremental by-products when using the
    /// given compiler.
    fn incremental_dir(&self, compiler: &Compiler) -> PathBuf {
        self.out.join(compiler.host).join(format!("stage{}-incremental", compiler.stage))
    }

    /// Returns the libdir where the standard library and other artifacts are
    /// found for a compiler's sysroot.
    fn sysroot_libdir(&self, compiler: &Compiler, target: &str) -> PathBuf {
        self.sysroot(compiler).join("lib").join("rustlib")
            .join(target).join("lib")
    }

    /// Returns the root directory for all output generated in a particular
    /// stage when running with a particular host compiler.
    ///
    /// The mode indicates what the root directory is for.
    fn stage_out(&self, compiler: &Compiler, mode: Mode) -> PathBuf {
        let suffix = match mode {
            Mode::Libstd => "-std",
            Mode::Libtest => "-test",
            Mode::Tool => "-tools",
            Mode::Librustc => "-rustc",
        };
        self.out.join(compiler.host)
            .join(format!("stage{}{}", compiler.stage, suffix))
    }

    /// Returns the root output directory for all Cargo output in a given stage,
    /// running a particular compiler, whether or not we're building the
    /// standard library, and targeting the specified architecture.
    fn cargo_out(&self,
                 compiler: &Compiler,
                 mode: Mode,
                 target: &str) -> PathBuf {
        self.stage_out(compiler, mode).join(target).join(self.cargo_dir())
    }

    /// Root output directory for LLVM compiled for `target`
    ///
    /// Note that if LLVM is configured externally then the directory returned
    /// will likely be empty.
    fn llvm_out(&self, target: &str) -> PathBuf {
        self.out.join(target).join("llvm")
    }

    /// Output directory for all documentation for a target
    fn doc_out(&self, target: &str) -> PathBuf {
        self.out.join(target).join("doc")
    }

    /// Output directory for all crate documentation for a target (temporary)
    ///
    /// The artifacts here are then copied into `doc_out` above.
    fn crate_doc_out(&self, target: &str) -> PathBuf {
        self.out.join(target).join("crate-docs")
    }

    /// Returns true if no custom `llvm-config` is set for the specified target.
    ///
    /// If no custom `llvm-config` was specified then Rust's llvm will be used.
    fn is_rust_llvm(&self, target: &str) -> bool {
        match self.config.target_config.get(target) {
            Some(ref c) => c.llvm_config.is_none(),
            None => true
        }
    }

    /// Returns the path to `llvm-config` for the specified target.
    ///
    /// If a custom `llvm-config` was specified for target then that's returned
    /// instead.
    fn llvm_config(&self, target: &str) -> PathBuf {
        let target_config = self.config.target_config.get(target);
        if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
            s.clone()
        } else {
            self.llvm_out(&self.config.build).join("bin")
                .join(exe("llvm-config", target))
        }
    }

    /// Returns the path to `FileCheck` binary for the specified target
    fn llvm_filecheck(&self, target: &str) -> PathBuf {
        let target_config = self.config.target_config.get(target);
        if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
            let llvm_bindir = output(Command::new(s).arg("--bindir"));
            Path::new(llvm_bindir.trim()).join(exe("FileCheck", target))
        } else {
            let base = self.llvm_out(&self.config.build).join("build");
            let exe = exe("FileCheck", target);
            if !self.config.ninja && self.config.build.contains("msvc") {
                base.join("Release/bin").join(exe)
            } else {
                base.join("bin").join(exe)
            }
        }
    }

    /// Directory for libraries built from C/C++ code and shared between stages.
    fn native_dir(&self, target: &str) -> PathBuf {
        self.out.join(target).join("native")
    }

    /// Root output directory for rust_test_helpers library compiled for
    /// `target`
    fn test_helpers_out(&self, target: &str) -> PathBuf {
        self.native_dir(target).join("rust-test-helpers")
    }

    /// Adds the compiler's directory of dynamic libraries to `cmd`'s dynamic
    /// library lookup path.
    fn add_rustc_lib_path(&self, compiler: &Compiler, cmd: &mut Command) {
        // Windows doesn't need dylib path munging because the dlls for the
        // compiler live next to the compiler and the system will find them
        // automatically.
        if cfg!(windows) {
            return
        }

        add_lib_path(vec![self.rustc_libdir(compiler)], cmd);
    }

    /// Adds the `RUST_TEST_THREADS` env var if necessary
    fn add_rust_test_threads(&self, cmd: &mut Command) {
        if env::var_os("RUST_TEST_THREADS").is_none() {
            cmd.env("RUST_TEST_THREADS", self.jobs().to_string());
        }
    }

    /// Returns the compiler's libdir where it stores the dynamic libraries that
    /// it itself links against.
    ///
    /// For example this returns `<sysroot>/lib` on Unix and `<sysroot>/bin` on
    /// Windows.
    fn rustc_libdir(&self, compiler: &Compiler) -> PathBuf {
        if compiler.is_snapshot(self) {
            self.rustc_snapshot_libdir()
        } else {
            self.sysroot(compiler).join(libdir(compiler.host))
        }
    }

    /// Returns the libdir of the snapshot compiler.
    fn rustc_snapshot_libdir(&self) -> PathBuf {
        self.rustc.parent().unwrap().parent().unwrap()
            .join(libdir(&self.config.build))
    }

    /// Runs a command, printing out nice contextual information if it fails.
    fn run(&self, cmd: &mut Command) {
        self.verbose(&format!("running: {:?}", cmd));
        run_silent(cmd)
    }

    /// Runs a command, printing out nice contextual information if it fails.
    fn run_quiet(&self, cmd: &mut Command) {
        self.verbose(&format!("running: {:?}", cmd));
        run_suppressed(cmd)
    }

    /// Prints a message if this build is configured in verbose mode.
    fn verbose(&self, msg: &str) {
        if self.flags.verbose() || self.config.verbose() {
            println!("{}", msg);
        }
    }

    /// Returns the number of parallel jobs that have been configured for this
    /// build.
    fn jobs(&self) -> u32 {
        self.flags.jobs.unwrap_or(num_cpus::get() as u32)
    }

    /// Returns the path to the C compiler for the target specified.
    fn cc(&self, target: &str) -> &Path {
        self.cc[target].0.path()
    }

    /// Returns a list of flags to pass to the C compiler for the target
    /// specified.
    fn cflags(&self, target: &str) -> Vec<String> {
        // Filter out -O and /O (the optimization flags) that we picked up from
        // gcc-rs because the build scripts will determine that for themselves.
        let mut base = self.cc[target].0.args().iter()
                           .map(|s| s.to_string_lossy().into_owned())
                           .filter(|s| !s.starts_with("-O") && !s.starts_with("/O"))
                           .collect::<Vec<_>>();

        // If we're compiling on macOS then we add a few unconditional flags
        // indicating that we want libc++ (more filled out than libstdc++) and
        // we want to compile for 10.7. This way we can ensure that
        // LLVM/jemalloc/etc are all properly compiled.
        if target.contains("apple-darwin") {
            base.push("-stdlib=libc++".into());
        }

        // Work around an apparently bad MinGW / GCC optimization,
        // See: http://lists.llvm.org/pipermail/cfe-dev/2016-December/051980.html
        // See: https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78936
        if target == "i686-pc-windows-gnu" {
            base.push("-fno-omit-frame-pointer".into());
        }
        return base
    }

    /// Returns the path to the `ar` archive utility for the target specified.
    fn ar(&self, target: &str) -> Option<&Path> {
        self.cc[target].1.as_ref().map(|p| &**p)
    }

    /// Returns the path to the C++ compiler for the target specified, may panic
    /// if no C++ compiler was configured for the target.
    fn cxx(&self, target: &str) -> &Path {
        match self.cxx.get(target) {
            Some(p) => p.path(),
            None => panic!("\n\ntarget `{}` is not configured as a host,
                            only as a target\n\n", target),
        }
    }

    /// Returns flags to pass to the compiler to generate code for `target`.
927 - fn rustc_flags(&self, target: &str) -> Vec<String> { 928 - // New flags should be added here with great caution! 929 - // 930 - // It's quite unfortunate to **require** flags to generate code for a 931 - // target, so it should only be passed here if absolutely necessary! 932 - // Most default configuration should be done through target specs rather 933 - // than an entry here. 934 - 935 - let mut base = Vec::new(); 936 - if target != self.config.build && !target.contains("msvc") && 937 - !target.contains("emscripten") { 938 - base.push(format!("-Clinker={}", self.cc(target).display())); 939 - } 940 - return base 941 - } 942 - 943 - /// Returns the "musl root" for this `target`, if defined 944 - fn musl_root(&self, target: &str) -> Option<&Path> { 945 - self.config.target_config.get(target) 946 - .and_then(|t| t.musl_root.as_ref()) 947 - .or(self.config.musl_root.as_ref()) 948 - .map(|p| &**p) 949 - } 950 - 951 - /// Returns whether the target will be tested using the `remote-test-client` 952 - /// and `remote-test-server` binaries. 953 - fn remote_tested(&self, target: &str) -> bool { 954 - self.qemu_rootfs(target).is_some() || target.contains("android") 955 - } 956 - 957 - /// Returns the root of the "rootfs" image that this target will be using, 958 - /// if one was configured. 959 - /// 960 - /// If `Some` is returned then that means that tests for this target are 961 - /// emulated with QEMU and binaries will need to be shipped to the emulator. 962 - fn qemu_rootfs(&self, target: &str) -> Option<&Path> { 963 - self.config.target_config.get(target) 964 - .and_then(|t| t.qemu_rootfs.as_ref()) 965 - .map(|p| &**p) 966 - } 967 - 968 - /// Path to the python interpreter to use 969 - fn python(&self) -> &Path { 970 - self.config.python.as_ref().unwrap() 971 - } 972 - 973 - /// Tests whether the `compiler` compiling for `target` should be forced to 974 - /// use a stage1 compiler instead. 
975 - /// 976 - /// Currently, by default, the build system does not perform a "full 977 - /// bootstrap" where we compile the compiler three times. 978 - /// Instead, we compile the compiler two times. The final stage (stage2) 979 - /// just copies the libraries from the previous stage, which is what this 980 - /// method detects. 981 - /// 982 - /// Here we return `true` if: 983 - /// 984 - /// * The build isn't performing a full bootstrap 985 - /// * The `compiler` is in the final stage, 2 986 - /// * We're not cross-compiling, so the artifacts are already available in 987 - /// stage1 988 - /// 989 - /// When all of these conditions are met the build will lift artifacts from 990 - /// the previous stage forward. 991 - fn force_use_stage1(&self, compiler: &Compiler, target: &str) -> bool { 992 - !self.config.full_bootstrap && 993 - compiler.stage >= 2 && 994 - self.config.host.iter().any(|h| h == target) 995 - } 996 - 997 - /// Returns the directory that OpenSSL artifacts are compiled into if 998 - /// configured to do so. 999 - fn openssl_dir(&self, target: &str) -> Option<PathBuf> { 1000 - // OpenSSL not used on Windows 1001 - if target.contains("windows") { 1002 - None 1003 - } else if self.config.openssl_static { 1004 - Some(self.out.join(target).join("openssl")) 1005 - } else { 1006 - None 1007 - } 1008 - } 1009 - 1010 - /// Returns the directory that OpenSSL artifacts are installed into if 1011 - /// configured as such. 1012 - fn openssl_install_dir(&self, target: &str) -> Option<PathBuf> { 1013 - self.openssl_dir(target).map(|p| p.join("install")) 1014 - } 1015 - 1016 - /// Given `num` in the form "a.b.c" return a "release string" which 1017 - /// describes the release version number. 1018 - /// 1019 - /// For example on nightly this returns "a.b.c-nightly", on beta it returns 1020 - /// "a.b.c-beta.1" and on stable it just returns "a.b.c". 1021 - fn release(&self, num: &str) -> String { 1022 - match &self.config.channel[..] 
{ 1023 - "stable" => num.to_string(), 1024 - "beta" => format!("{}-beta{}", num, channel::CFG_PRERELEASE_VERSION), 1025 - "nightly" => format!("{}-nightly", num), 1026 - _ => format!("{}-dev", num), 1027 - } 1028 - } 1029 - 1030 - /// Returns the value of `release` above for Rust itself. 1031 - fn rust_release(&self) -> String { 1032 - self.release(channel::CFG_RELEASE_NUM) 1033 - } 1034 - 1035 - /// Returns the "package version" for a component given the `num` release 1036 - /// number. 1037 - /// 1038 - /// The package version is typically what shows up in the names of tarballs. 1039 - /// For channels like beta/nightly it's just the channel name, otherwise 1040 - /// it's the `num` provided. 1041 - fn package_vers(&self, num: &str) -> String { 1042 - match &self.config.channel[..] { 1043 - "stable" => num.to_string(), 1044 - "beta" => "beta".to_string(), 1045 - "nightly" => "nightly".to_string(), 1046 - _ => format!("{}-dev", num), 1047 - } 1048 - } 1049 - 1050 - /// Returns the value of `package_vers` above for Rust itself. 1051 - fn rust_package_vers(&self) -> String { 1052 - self.package_vers(channel::CFG_RELEASE_NUM) 1053 - } 1054 - 1055 - /// Returns the value of `package_vers` above for Cargo 1056 - fn cargo_package_vers(&self) -> String { 1057 - self.package_vers(&self.release_num("cargo")) 1058 - } 1059 - 1060 - /// Returns the value of `package_vers` above for rls 1061 - fn rls_package_vers(&self) -> String { 1062 - self.package_vers(&self.release_num("rls")) 1063 - } 1064 - 1065 - /// Returns the `version` string associated with this compiler for Rust 1066 - /// itself. 1067 - /// 1068 - /// Note that this is a descriptive string which includes the commit date, 1069 - /// sha, version, etc. 1070 - fn rust_version(&self) -> String { 1071 - self.rust_info.version(self, channel::CFG_RELEASE_NUM) 1072 - } 1073 - 1074 - /// Returns the `a.b.c` version that the given package is at. 
1075 - fn release_num(&self, package: &str) -> String { 1076 - let mut toml = String::new(); 1077 - let toml_file_name = self.src.join(&format!("src/tools/{}/Cargo.toml", package)); 1078 - t!(t!(File::open(toml_file_name)).read_to_string(&mut toml)); 1079 - for line in toml.lines() { 1080 - let prefix = "version = \""; 1081 - let suffix = "\""; 1082 - if line.starts_with(prefix) && line.ends_with(suffix) { 1083 - return line[prefix.len()..line.len() - suffix.len()].to_string() 1084 - } 1085 - } 1086 - 1087 - panic!("failed to find version in {}'s Cargo.toml", package) 1088 - } 1089 - 1090 - /// Returns whether unstable features should be enabled for the compiler 1091 - /// we're building. 1092 - fn unstable_features(&self) -> bool { 1093 - match &self.config.channel[..] { 1094 - "stable" | "beta" => false, 1095 - "nightly" | _ => true, 1096 - } 1097 - } 1098 -} 1099 - 1100 -impl<'a> Compiler<'a> { 1101 - /// Creates a new compiler for the specified stage/host 1102 - fn new(stage: u32, host: &'a str) -> Compiler<'a> { 1103 - Compiler { stage: stage, host: host } 1104 - } 1105 - 1106 - /// Returns whether this is a snapshot compiler for `build`'s configuration 1107 - fn is_snapshot(&self, build: &Build) -> bool { 1108 - self.stage == 0 && self.host == build.config.build 1109 - } 1110 - 1111 - /// Returns if this compiler should be treated as a final stage one in the 1112 - /// current build session. 1113 - /// This takes into account whether we're performing a full bootstrap or 1114 - /// not; don't directly compare the stage with `2`! 1115 - fn is_final_stage(&self, build: &Build) -> bool { 1116 - let final_stage = if build.config.full_bootstrap { 2 } else { 1 }; 1117 - self.stage >= final_stage 1118 - } 1119 -}
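The channel-dependent version strings built by `release` and `package_vers` in the deleted code above can be sketched as a free-standing function. This is a minimal sketch: the standalone signature and the `beta_prerelease` parameter are illustrative only — in rustbuild these are methods on `Build` and the beta pre-release number comes from `channel::CFG_PRERELEASE_VERSION`.

```rust
// Sketch of the channel -> release-string mapping from `Build::release`.
// `num` is an "a.b.c" version; `channel` is "stable", "beta", "nightly",
// or anything else (treated as a dev channel).
fn release(num: &str, channel: &str, beta_prerelease: &str) -> String {
    match channel {
        "stable" => num.to_string(),
        "beta" => format!("{}-beta{}", num, beta_prerelease),
        "nightly" => format!("{}-nightly", num),
        _ => format!("{}-dev", num),
    }
}

fn main() {
    assert_eq!(release("1.19.0", "stable", ".2"), "1.19.0");
    assert_eq!(release("1.19.0", "beta", ".2"), "1.19.0-beta.2");
    assert_eq!(release("1.19.0", "nightly", ".2"), "1.19.0-nightly");
    assert_eq!(release("1.19.0", "dev", ".2"), "1.19.0-dev");
    println!("ok");
}
```

`package_vers` follows the same match but collapses beta/nightly to just the channel name, which is why nightly tarballs carry "nightly" rather than a version number.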
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/metadata.rs version [9fb5111a4c].
1 -// Copyright 2016 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -use std::collections::HashMap; 12 -use std::process::Command; 13 -use std::path::PathBuf; 14 - 15 -use build_helper::output; 16 -use rustc_serialize::json; 17 - 18 -use {Build, Crate}; 19 - 20 -#[derive(RustcDecodable)] 21 -struct Output { 22 - packages: Vec<Package>, 23 - resolve: Resolve, 24 -} 25 - 26 -#[derive(RustcDecodable)] 27 -struct Package { 28 - id: String, 29 - name: String, 30 - version: String, 31 - source: Option<String>, 32 - manifest_path: String, 33 -} 34 - 35 -#[derive(RustcDecodable)] 36 -struct Resolve { 37 - nodes: Vec<ResolveNode>, 38 -} 39 - 40 -#[derive(RustcDecodable)] 41 -struct ResolveNode { 42 - id: String, 43 - dependencies: Vec<String>, 44 -} 45 - 46 -pub fn build(build: &mut Build) { 47 - build_krate(build, "src/libstd"); 48 - build_krate(build, "src/libtest"); 49 - build_krate(build, "src/rustc"); 50 -} 51 - 52 -fn build_krate(build: &mut Build, krate: &str) { 53 - // Run `cargo metadata` to figure out what crates we're testing. 54 - // 55 - // Down below we're going to call `cargo test`, but to test the right set 56 - // of packages we're going to have to know what `-p` arguments to pass it 57 - // to know what crates to test. Here we run `cargo metadata` to learn about 58 - // the dependency graph and what `-p` arguments there are. 
59 - let mut cargo = Command::new(&build.cargo); 60 - cargo.arg("metadata") 61 - .arg("--format-version").arg("1") 62 - .arg("--manifest-path").arg(build.src.join(krate).join("Cargo.toml")); 63 - let output = output(&mut cargo); 64 - let output: Output = json::decode(&output).unwrap(); 65 - let mut id2name = HashMap::new(); 66 - for package in output.packages { 67 - if package.source.is_none() { 68 - id2name.insert(package.id, package.name.clone()); 69 - let mut path = PathBuf::from(package.manifest_path); 70 - path.pop(); 71 - build.crates.insert(package.name.clone(), Crate { 72 - build_step: format!("build-crate-{}", package.name), 73 - doc_step: format!("doc-crate-{}", package.name), 74 - test_step: format!("test-crate-{}", package.name), 75 - bench_step: format!("bench-crate-{}", package.name), 76 - name: package.name, 77 - version: package.version, 78 - deps: Vec::new(), 79 - path: path, 80 - }); 81 - } 82 - } 83 - 84 - for node in output.resolve.nodes { 85 - let name = match id2name.get(&node.id) { 86 - Some(name) => name, 87 - None => continue, 88 - }; 89 - 90 - let krate = build.crates.get_mut(name).unwrap(); 91 - for dep in node.dependencies.iter() { 92 - let dep = match id2name.get(dep) { 93 - Some(dep) => dep, 94 - None => continue, 95 - }; 96 - krate.deps.push(dep.clone()); 97 - } 98 - } 99 -}
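The two passes in `build_krate` above — first mapping package ids to names for local (path) packages only, then walking the resolve graph and skipping any dependency id that wasn't kept — can be sketched with plain `HashMap`s. The in-memory tuples below are hypothetical stand-ins for the `packages` and `resolve.nodes` arrays that `cargo metadata --format-version 1` emits; the function name is illustrative.

```rust
use std::collections::HashMap;

// Build a name -> local-dependency-names map from (id, name, source)
// package records and (id, dep_ids) resolve nodes. A `None` source
// marks a local path package, as in the code above.
fn local_deps(packages: &[(&str, &str, Option<&str>)],
              nodes: &[(&str, Vec<&str>)]) -> HashMap<String, Vec<String>> {
    // First pass: remember ids of local packages only.
    let mut id2name = HashMap::new();
    let mut deps: HashMap<String, Vec<String>> = HashMap::new();
    for &(id, name, source) in packages {
        if source.is_none() {
            id2name.insert(id, name);
            deps.insert(name.to_string(), Vec::new());
        }
    }
    // Second pass: record dependencies, skipping ids that weren't kept
    // (i.e. external crates), like the `continue` arms above.
    for &(id, ref dep_ids) in nodes {
        let name = match id2name.get(id) { Some(n) => *n, None => continue };
        for dep_id in dep_ids {
            if let Some(dep) = id2name.get(dep_id) {
                deps.get_mut(name).unwrap().push(dep.to_string());
            }
        }
    }
    deps
}

fn main() {
    let packages = [
        ("std 0.0.0 (path+file:///src)", "std", None),
        ("core 0.0.0 (path+file:///src)", "core", None),
        ("libc 0.2.0 (registry+https://...)", "libc", Some("registry")),
    ];
    let nodes = [
        ("std 0.0.0 (path+file:///src)",
         vec!["core 0.0.0 (path+file:///src)",
              "libc 0.2.0 (registry+https://...)"]),
    ];
    let deps = local_deps(&packages, &nodes);
    // libc comes from a registry, so it is filtered out of std's deps.
    assert_eq!(deps["std"], vec!["core".to_string()]);
    assert!(deps["core"].is_empty());
    println!("ok");
}
```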
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/mk/Makefile.in version [7f17834006].
1 -# Copyright 2016 The Rust Project Developers. See the COPYRIGHT 2 -# file at the top-level directory of this distribution and at 3 -# http://rust-lang.org/COPYRIGHT. 4 -# 5 -# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -# option. This file may not be copied, modified, or distributed 9 -# except according to those terms. 10 - 11 -include config.mk 12 - 13 -ifdef VERBOSE 14 -Q := 15 -BOOTSTRAP_ARGS := -v 16 -else 17 -Q := @ 18 -BOOTSTRAP_ARGS := 19 -endif 20 - 21 -BOOTSTRAP := $(CFG_PYTHON) $(CFG_SRC_DIR)src/bootstrap/bootstrap.py 22 - 23 -all: 24 - $(Q)$(BOOTSTRAP) build $(BOOTSTRAP_ARGS) 25 - $(Q)$(BOOTSTRAP) doc $(BOOTSTRAP_ARGS) 26 - 27 -help: 28 - $(Q)echo 'Welcome to the rustbuild build system!' 29 - $(Q)echo 30 - $(Q)echo This makefile is a thin veneer over the ./x.py script located 31 - $(Q)echo in this directory. To get the full power of the build system 32 - $(Q)echo you can run x.py directly. 
33 - $(Q)echo 34 - $(Q)echo To learn more run \`./x.py --help\` 35 - 36 -clean: 37 - $(Q)$(BOOTSTRAP) clean $(BOOTSTRAP_ARGS) 38 - 39 -rustc-stage1: 40 - $(Q)$(BOOTSTRAP) build --stage 1 src/libtest $(BOOTSTRAP_ARGS) 41 -rustc-stage2: 42 - $(Q)$(BOOTSTRAP) build --stage 2 src/libtest $(BOOTSTRAP_ARGS) 43 - 44 -docs: doc 45 -doc: 46 - $(Q)$(BOOTSTRAP) doc $(BOOTSTRAP_ARGS) 47 -nomicon: 48 - $(Q)$(BOOTSTRAP) doc src/doc/nomicon $(BOOTSTRAP_ARGS) 49 -book: 50 - $(Q)$(BOOTSTRAP) doc src/doc/book $(BOOTSTRAP_ARGS) 51 -standalone-docs: 52 - $(Q)$(BOOTSTRAP) doc src/doc $(BOOTSTRAP_ARGS) 53 -check: 54 - $(Q)$(BOOTSTRAP) test $(BOOTSTRAP_ARGS) 55 -check-aux: 56 - $(Q)$(BOOTSTRAP) test \ 57 - src/tools/cargotest \ 58 - cargo \ 59 - src/test/pretty \ 60 - src/test/run-pass/pretty \ 61 - src/test/run-fail/pretty \ 62 - src/test/run-pass-valgrind/pretty \ 63 - src/test/run-pass-fulldeps/pretty \ 64 - src/test/run-fail-fulldeps/pretty \ 65 - $(BOOTSTRAP_ARGS) 66 -dist: 67 - $(Q)$(BOOTSTRAP) dist $(BOOTSTRAP_ARGS) 68 -distcheck: 69 - $(Q)$(BOOTSTRAP) dist $(BOOTSTRAP_ARGS) 70 - $(Q)$(BOOTSTRAP) test distcheck $(BOOTSTRAP_ARGS) 71 -install: 72 - $(Q)$(BOOTSTRAP) dist --install $(BOOTSTRAP_ARGS) 73 -tidy: 74 - $(Q)$(BOOTSTRAP) test src/tools/tidy $(BOOTSTRAP_ARGS) 75 -prepare: 76 - $(Q)$(BOOTSTRAP) build nonexistent/path/to/trigger/cargo/metadata 77 - 78 -check-stage2-T-arm-linux-androideabi-H-x86_64-unknown-linux-gnu: 79 - $(Q)$(BOOTSTRAP) test --target arm-linux-androideabi 80 -check-stage2-T-x86_64-unknown-linux-musl-H-x86_64-unknown-linux-gnu: 81 - $(Q)$(BOOTSTRAP) test --target x86_64-unknown-linux-musl 82 - 83 - 84 -.PHONY: dist
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/native.rs version [bf55423342].
1 -// Copyright 2015 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Compilation of native dependencies like LLVM. 12 -//! 13 -//! Native projects like LLVM unfortunately aren't suited just yet for 14 -//! compilation in build scripts that Cargo has. This is because this 15 -//! compilation takes a *very* long time but also because we don't want to 16 -//! compile LLVM 3 times as part of a normal bootstrap (we want it cached). 17 -//! 18 -//! LLVM and compiler-rt are essentially just wired up to everything else to 19 -//! ensure that they're always in place if needed. 20 - 21 -use std::env; 22 -use std::ffi::OsString; 23 -use std::fs::{self, File}; 24 -use std::io::{Read, Write}; 25 -use std::path::Path; 26 -use std::process::Command; 27 - 28 -use build_helper::output; 29 -use cmake; 30 -use gcc; 31 - 32 -use Build; 33 -use util; 34 -use build_helper::up_to_date; 35 - 36 -/// Compile LLVM for `target`. 37 -pub fn llvm(build: &Build, target: &str) { 38 - // If we're using a custom LLVM bail out here, but we can only use a 39 - // custom LLVM for the build triple. 
40 - if let Some(config) = build.config.target_config.get(target) { 41 - if let Some(ref s) = config.llvm_config { 42 - return check_llvm_version(build, s); 43 - } 44 - } 45 - 46 - let rebuild_trigger = build.src.join("src/rustllvm/llvm-rebuild-trigger"); 47 - let mut rebuild_trigger_contents = String::new(); 48 - t!(t!(File::open(&rebuild_trigger)).read_to_string(&mut rebuild_trigger_contents)); 49 - 50 - let out_dir = build.llvm_out(target); 51 - let done_stamp = out_dir.join("llvm-finished-building"); 52 - if done_stamp.exists() { 53 - let mut done_contents = String::new(); 54 - t!(t!(File::open(&done_stamp)).read_to_string(&mut done_contents)); 55 - 56 - // If LLVM was already built previously and contents of the rebuild-trigger file 57 - // didn't change from the previous build, then no action is required. 58 - if done_contents == rebuild_trigger_contents { 59 - return 60 - } 61 - } 62 - if build.config.llvm_clean_rebuild { 63 - drop(fs::remove_dir_all(&out_dir)); 64 - } 65 - 66 - println!("Building LLVM for {}", target); 67 - let _time = util::timeit(); 68 - t!(fs::create_dir_all(&out_dir)); 69 - 70 - // http://llvm.org/docs/CMake.html 71 - let mut cfg = cmake::Config::new(build.src.join("src/llvm")); 72 - if build.config.ninja { 73 - cfg.generator("Ninja"); 74 - } 75 - 76 - let profile = match (build.config.llvm_optimize, build.config.llvm_release_debuginfo) { 77 - (false, _) => "Debug", 78 - (true, false) => "Release", 79 - (true, true) => "RelWithDebInfo", 80 - }; 81 - 82 - // NOTE: remember to also update `config.toml.example` when changing the defaults! 
83 - let llvm_targets = match build.config.llvm_targets { 84 - Some(ref s) => s, 85 - None => "X86;ARM;AArch64;Mips;PowerPC;SystemZ;JSBackend;MSP430;Sparc;NVPTX;Hexagon", 86 - }; 87 - 88 - let assertions = if build.config.llvm_assertions {"ON"} else {"OFF"}; 89 - 90 - cfg.target(target) 91 - .host(&build.config.build) 92 - .out_dir(&out_dir) 93 - .profile(profile) 94 - .define("LLVM_ENABLE_ASSERTIONS", assertions) 95 - .define("LLVM_TARGETS_TO_BUILD", llvm_targets) 96 - .define("LLVM_INCLUDE_EXAMPLES", "OFF") 97 - .define("LLVM_INCLUDE_TESTS", "OFF") 98 - .define("LLVM_INCLUDE_DOCS", "OFF") 99 - .define("LLVM_ENABLE_ZLIB", "OFF") 100 - .define("WITH_POLLY", "OFF") 101 - .define("LLVM_ENABLE_TERMINFO", "OFF") 102 - .define("LLVM_ENABLE_LIBEDIT", "OFF") 103 - .define("LLVM_PARALLEL_COMPILE_JOBS", build.jobs().to_string()) 104 - .define("LLVM_TARGET_ARCH", target.split('-').next().unwrap()) 105 - .define("LLVM_DEFAULT_TARGET_TRIPLE", target); 106 - 107 - if target.contains("msvc") { 108 - cfg.define("LLVM_USE_CRT_DEBUG", "MT"); 109 - cfg.define("LLVM_USE_CRT_RELEASE", "MT"); 110 - cfg.define("LLVM_USE_CRT_RELWITHDEBINFO", "MT"); 111 - cfg.static_crt(true); 112 - } 113 - 114 - if target.starts_with("i686") { 115 - cfg.define("LLVM_BUILD_32_BITS", "ON"); 116 - } 117 - 118 - if let Some(num_linkers) = build.config.llvm_link_jobs { 119 - if num_linkers > 0 { 120 - cfg.define("LLVM_PARALLEL_LINK_JOBS", num_linkers.to_string()); 121 - } 122 - } 123 - 124 - // http://llvm.org/docs/HowToCrossCompileLLVM.html 125 - if target != build.config.build { 126 - // FIXME: if the llvm root for the build triple is overridden then we 127 - // should use llvm-tblgen from there, also should verify that it 128 - // actually exists most of the time in normal installs of LLVM. 
129 - let host = build.llvm_out(&build.config.build).join("bin/llvm-tblgen"); 130 - cfg.define("CMAKE_CROSSCOMPILING", "True") 131 - .define("LLVM_TABLEGEN", &host); 132 - } 133 - 134 - let sanitize_cc = |cc: &Path| { 135 - if target.contains("msvc") { 136 - OsString::from(cc.to_str().unwrap().replace("\\", "/")) 137 - } else { 138 - cc.as_os_str().to_owned() 139 - } 140 - }; 141 - 142 - let configure_compilers = |cfg: &mut cmake::Config| { 143 - // MSVC with CMake uses msbuild by default which doesn't respect these 144 - // vars that we'd otherwise configure. In that case we just skip this 145 - // entirely. 146 - if target.contains("msvc") && !build.config.ninja { 147 - return 148 - } 149 - 150 - let cc = build.cc(target); 151 - let cxx = build.cxx(target); 152 - 153 - // Handle msvc + ninja + ccache specially (this is what the bots use) 154 - if target.contains("msvc") && 155 - build.config.ninja && 156 - build.config.ccache.is_some() { 157 - let mut cc = env::current_exe().expect("failed to get cwd"); 158 - cc.set_file_name("sccache-plus-cl.exe"); 159 - 160 - cfg.define("CMAKE_C_COMPILER", sanitize_cc(&cc)) 161 - .define("CMAKE_CXX_COMPILER", sanitize_cc(&cc)); 162 - cfg.env("SCCACHE_PATH", 163 - build.config.ccache.as_ref().unwrap()) 164 - .env("SCCACHE_TARGET", target); 165 - 166 - // If ccache is configured we inform the build a little differently how 167 - // to invoke ccache while also invoking our compilers. 
168 - } else if let Some(ref ccache) = build.config.ccache { 169 - cfg.define("CMAKE_C_COMPILER", ccache) 170 - .define("CMAKE_C_COMPILER_ARG1", sanitize_cc(cc)) 171 - .define("CMAKE_CXX_COMPILER", ccache) 172 - .define("CMAKE_CXX_COMPILER_ARG1", sanitize_cc(cxx)); 173 - } else { 174 - cfg.define("CMAKE_C_COMPILER", sanitize_cc(cc)) 175 - .define("CMAKE_CXX_COMPILER", sanitize_cc(cxx)); 176 - } 177 - 178 - cfg.build_arg("-j").build_arg(build.jobs().to_string()); 179 - cfg.define("CMAKE_C_FLAGS", build.cflags(target).join(" ")); 180 - cfg.define("CMAKE_CXX_FLAGS", build.cflags(target).join(" ")); 181 - }; 182 - 183 - configure_compilers(&mut cfg); 184 - 185 - if env::var_os("SCCACHE_ERROR_LOG").is_some() { 186 - cfg.env("RUST_LOG", "sccache=info"); 187 - } 188 - 189 - // FIXME: we don't actually need to build all LLVM tools and all LLVM 190 - // libraries here, e.g. we just want a few components and a few 191 - // tools. Figure out how to filter them down and only build the right 192 - // tools and libs on all platforms. 193 - cfg.build(); 194 - 195 - t!(t!(File::create(&done_stamp)).write_all(rebuild_trigger_contents.as_bytes())); 196 -} 197 - 198 -fn check_llvm_version(build: &Build, llvm_config: &Path) { 199 - if !build.config.llvm_version_check { 200 - return 201 - } 202 - 203 - let mut cmd = Command::new(llvm_config); 204 - let version = output(cmd.arg("--version")); 205 - if version.starts_with("3.5") || version.starts_with("3.6") || 206 - version.starts_with("3.7") { 207 - return 208 - } 209 - panic!("\n\nbad LLVM version: {}, need >=3.5\n\n", version) 210 -} 211 - 212 -/// Compiles the `rust_test_helpers.c` library which we used in various 213 -/// `run-pass` test suites for ABI testing. 
214 -pub fn test_helpers(build: &Build, target: &str) { 215 - let dst = build.test_helpers_out(target); 216 - let src = build.src.join("src/rt/rust_test_helpers.c"); 217 - if up_to_date(&src, &dst.join("librust_test_helpers.a")) { 218 - return 219 - } 220 - 221 - println!("Building test helpers"); 222 - t!(fs::create_dir_all(&dst)); 223 - let mut cfg = gcc::Config::new(); 224 - 225 - // We may have found various cross-compilers a little differently due to our 226 - // extra configuration, so inform gcc of these compilers. Note, though, that 227 - // on MSVC we still need gcc's detection of env vars (ugh). 228 - if !target.contains("msvc") { 229 - if let Some(ar) = build.ar(target) { 230 - cfg.archiver(ar); 231 - } 232 - cfg.compiler(build.cc(target)); 233 - } 234 - 235 - cfg.cargo_metadata(false) 236 - .out_dir(&dst) 237 - .target(target) 238 - .host(&build.config.build) 239 - .opt_level(0) 240 - .debug(false) 241 - .file(build.src.join("src/rt/rust_test_helpers.c")) 242 - .compile("librust_test_helpers.a"); 243 -} 244 -const OPENSSL_VERS: &'static str = "1.0.2k"; 245 -const OPENSSL_SHA256: &'static str = 246 - "6b3977c61f2aedf0f96367dcfb5c6e578cf37e7b8d913b4ecb6643c3cb88d8c0"; 247 - 248 -pub fn openssl(build: &Build, target: &str) { 249 - let out = match build.openssl_dir(target) { 250 - Some(dir) => dir, 251 - None => return, 252 - }; 253 - 254 - let stamp = out.join(".stamp"); 255 - let mut contents = String::new(); 256 - drop(File::open(&stamp).and_then(|mut f| f.read_to_string(&mut contents))); 257 - if contents == OPENSSL_VERS { 258 - return 259 - } 260 - t!(fs::create_dir_all(&out)); 261 - 262 - let name = format!("openssl-{}.tar.gz", OPENSSL_VERS); 263 - let tarball = out.join(&name); 264 - if !tarball.exists() { 265 - let tmp = tarball.with_extension("tmp"); 266 - // originally from https://www.openssl.org/source/... 
267 - let url = format!("https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/{}", 268 - name); 269 - let mut ok = false; 270 - for _ in 0..3 { 271 - let status = Command::new("curl") 272 - .arg("-o").arg(&tmp) 273 - .arg(&url) 274 - .status() 275 - .expect("failed to spawn curl"); 276 - if status.success() { 277 - ok = true; 278 - break 279 - } 280 - } 281 - if !ok { 282 - panic!("failed to download openssl source") 283 - } 284 - let mut shasum = if target.contains("apple") { 285 - let mut cmd = Command::new("shasum"); 286 - cmd.arg("-a").arg("256"); 287 - cmd 288 - } else { 289 - Command::new("sha256sum") 290 - }; 291 - let output = output(&mut shasum.arg(&tmp)); 292 - let found = output.split_whitespace().next().unwrap(); 293 - if found != OPENSSL_SHA256 { 294 - panic!("downloaded openssl sha256 different\n\ 295 - expected: {}\n\ 296 - found: {}\n", OPENSSL_SHA256, found); 297 - } 298 - t!(fs::rename(&tmp, &tarball)); 299 - } 300 - let obj = out.join(format!("openssl-{}", OPENSSL_VERS)); 301 - let dst = build.openssl_install_dir(target).unwrap(); 302 - drop(fs::remove_dir_all(&obj)); 303 - drop(fs::remove_dir_all(&dst)); 304 - build.run(Command::new("tar").arg("xf").arg(&tarball).current_dir(&out)); 305 - 306 - let mut configure = Command::new(obj.join("Configure")); 307 - configure.arg(format!("--prefix={}", dst.display())); 308 - configure.arg("no-dso"); 309 - configure.arg("no-ssl2"); 310 - configure.arg("no-ssl3"); 311 - 312 - let os = match target { 313 - "aarch64-linux-android" => "linux-aarch64", 314 - "aarch64-unknown-linux-gnu" => "linux-aarch64", 315 - "arm-linux-androideabi" => "android", 316 - "arm-unknown-linux-gnueabi" => "linux-armv4", 317 - "arm-unknown-linux-gnueabihf" => "linux-armv4", 318 - "armv7-linux-androideabi" => "android-armv7", 319 - "armv7-unknown-linux-gnueabihf" => "linux-armv4", 320 - "i686-apple-darwin" => "darwin-i386-cc", 321 - "i686-linux-android" => "android-x86", 322 - "i686-unknown-freebsd" => "BSD-x86-elf", 323 - 
"i686-unknown-linux-gnu" => "linux-elf", 324 - "i686-unknown-linux-musl" => "linux-elf", 325 - "mips-unknown-linux-gnu" => "linux-mips32", 326 - "mips64-unknown-linux-gnuabi64" => "linux64-mips64", 327 - "mips64el-unknown-linux-gnuabi64" => "linux64-mips64", 328 - "mipsel-unknown-linux-gnu" => "linux-mips32", 329 - "powerpc-unknown-linux-gnu" => "linux-ppc", 330 - "powerpc64-unknown-linux-gnu" => "linux-ppc64", 331 - "powerpc64le-unknown-linux-gnu" => "linux-ppc64le", 332 - "s390x-unknown-linux-gnu" => "linux64-s390x", 333 - "x86_64-apple-darwin" => "darwin64-x86_64-cc", 334 - "x86_64-linux-android" => "linux-x86_64", 335 - "x86_64-unknown-freebsd" => "BSD-x86_64", 336 - "x86_64-unknown-linux-gnu" => "linux-x86_64", 337 - "x86_64-unknown-linux-musl" => "linux-x86_64", 338 - "x86_64-unknown-netbsd" => "BSD-x86_64", 339 - _ => panic!("don't know how to configure OpenSSL for {}", target), 340 - }; 341 - configure.arg(os); 342 - configure.env("CC", build.cc(target)); 343 - for flag in build.cflags(target) { 344 - configure.arg(flag); 345 - } 346 - // There is no specific os target for android aarch64 or x86_64, 347 - // so we need to pass some extra cflags 348 - if target == "aarch64-linux-android" || target == "x86_64-linux-android" { 349 - configure.arg("-mandroid"); 350 - configure.arg("-fomit-frame-pointer"); 351 - } 352 - // Make PIE binaries 353 - // Non-PIE linker support was removed in Lollipop 354 - // https://source.android.com/security/enhancements/enhancements50 355 - if target == "i686-linux-android" { 356 - configure.arg("no-asm"); 357 - } 358 - configure.current_dir(&obj); 359 - println!("Configuring openssl for {}", target); 360 - build.run_quiet(&mut configure); 361 - println!("Building openssl for {}", target); 362 - build.run_quiet(Command::new("make").arg("-j1").current_dir(&obj)); 363 - println!("Installing openssl for {}", target); 364 - build.run_quiet(Command::new("make").arg("install").current_dir(&obj)); 365 - 366 - let mut f = 
t!(File::create(&stamp)); 367 - t!(f.write_all(OPENSSL_VERS.as_bytes())); 368 -}
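Both the LLVM and OpenSSL steps in `native.rs` above avoid needless rebuilds with the same stamp-file idiom: read the stamp, compare it against the current trigger contents (the rebuild-trigger file or `OPENSSL_VERS`), rebuild only on mismatch, and rewrite the stamp afterwards. A minimal sketch of that pattern — the helper name and temp-dir paths are illustrative, not rustbuild API:

```rust
use std::fs::File;
use std::io::{Read, Write};
use std::path::Path;

// Runs `work` only when the stamp is missing or its contents differ
// from `trigger`; returns true if `work` ran. Ignoring the read error
// mirrors the `drop(File::open(..)..)` idiom in the code above: an
// unreadable stamp simply means "stale".
fn run_if_stale<F: FnOnce()>(stamp: &Path, trigger: &str, work: F) -> bool {
    let mut cached = String::new();
    let _ = File::open(stamp).and_then(|mut f| f.read_to_string(&mut cached));
    if cached == trigger {
        return false;
    }
    work();
    File::create(stamp).unwrap().write_all(trigger.as_bytes()).unwrap();
    true
}

fn main() {
    let stamp = std::env::temp_dir().join("demo-openssl.stamp");
    let _ = std::fs::remove_file(&stamp);
    assert!(run_if_stale(&stamp, "1.0.2k", || println!("building")));
    assert!(!run_if_stale(&stamp, "1.0.2k", || println!("building")));
    assert!(run_if_stale(&stamp, "1.0.2l", || println!("building")));
    let _ = std::fs::remove_file(&stamp);
    println!("ok");
}
```

Writing the stamp only after `work` completes means an interrupted build leaves a stale (or absent) stamp, so the next run rebuilds rather than trusting a half-finished output directory.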
Deleted wiki_references/2017/software/Rust/src_from_GitHub/the_repository_clones/rust/src/bootstrap/sanity.rs version [d360e5acd1].
1 -// Copyright 2015 The Rust Project Developers. See the COPYRIGHT 2 -// file at the top-level directory of this distribution and at 3 -// http://rust-lang.org/COPYRIGHT. 4 -// 5 -// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or 6 -// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license 7 -// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your 8 -// option. This file may not be copied, modified, or distributed 9 -// except according to those terms. 10 - 11 -//! Sanity checking performed by rustbuild before actually executing anything. 12 -//! 13 -//! This module contains the implementation of ensuring that the build 14 -//! environment looks reasonable before progressing. This will verify that 15 -//! various programs like git and python exist, along with ensuring that all C 16 -//! compilers for cross-compiling are found. 17 -//! 18 -//! In theory if we get past this phase it's a bug if a build fails, but in 19 -//! practice that's likely not true! 20 - 21 -use std::collections::HashSet; 22 -use std::env; 23 -use std::ffi::{OsStr, OsString}; 24 -use std::fs; 25 -use std::process::Command; 26 - 27 -use build_helper::output; 28 - 29 -use Build; 30 - 31 -pub fn check(build: &mut Build) { 32 - let mut checked = HashSet::new(); 33 - let path = env::var_os("PATH").unwrap_or(OsString::new()); 34 - // On Windows, quotes are invalid characters for filename paths, and if 35 - // one is present as part of the PATH then that can lead to the system 36 - // being unable to identify the files properly. See 37 - // https://github.com/rust-lang/rust/issues/34959 for more details. 
38 - if cfg!(windows) { 39 - if path.to_string_lossy().contains("\"") { 40 - panic!("PATH contains invalid character '\"'"); 41 - } 42 - } 43 - let have_cmd = |cmd: &OsStr| { 44 - for path in env::split_paths(&path) { 45 - let target = path.join(cmd); 46 - let mut cmd_alt = cmd.to_os_string(); 47 - cmd_alt.push(".exe"); 48 - if target.is_file() || 49 - target.with_extension("exe").exists() || 50 - target.join(cmd_alt).exists() { 51 - return Some(target); 52 - } 53 - } 54 - return None; 55 - }; 56 - 57 - let mut need_cmd = |cmd: &OsStr| { 58 - if !checked.insert(cmd.to_owned()) { 59 - return 60 - } 61 - if have_cmd(cmd).is_none() { 62 - panic!("\n\ncouldn't find required command: {:?}\n\n", cmd); 63 - } 64 - }; 65 - 66 - // If we've got a git directory we're going to need git to update 67 - // submodules and learn about various other aspects. 68 - if build.src_is_git { 69 - need_cmd("git".as_ref()); 70 - } 71 - 72 - // We need cmake, but only if we're actually building LLVM or sanitizers. 73 - let building_llvm = build.config.host.iter() 74 - .filter_map(|host| build.config.target_config.get(host)) 75 - .any(|config| config.llvm_config.is_none()); 76 - if building_llvm || build.config.sanitizers { 77 - need_cmd("cmake".as_ref()); 78 - } 79 - 80 - // Ninja is currently only used for LLVM itself. 81 - if building_llvm && build.config.ninja { 82 - // Some Linux distros rename `ninja` to `ninja-build`. 83
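The `have_cmd` closure in `sanity.rs` above walks each entry of `PATH` looking for the command, also probing a Windows-style `.exe` variant. A minimal stdlib-only sketch of the same lookup (the function name is illustrative, and the Windows `cmd_alt` join case is omitted for brevity):

```rust
use std::env;
use std::ffi::OsStr;
use std::path::PathBuf;

// Search every PATH entry for `cmd`, probing both the bare name and a
// Windows-style `.exe` variant, like the `have_cmd` closure above.
fn find_cmd(cmd: &OsStr, path: &OsStr) -> Option<PathBuf> {
    for dir in env::split_paths(path) {
        let candidate = dir.join(cmd);
        if candidate.is_file() || candidate.with_extension("exe").is_file() {
            return Some(candidate);
        }
    }
    None
}

fn main() {
    let path = env::var_os("PATH").unwrap_or_default();
    // `sh` should exist on effectively every Unix system.
    if cfg!(unix) {
        assert!(find_cmd(OsStr::new("sh"), &path).is_some());
    }
    assert!(find_cmd(OsStr::new("no-such-command-zzz-123"), &path).is_none());
    println!("ok");
}
```

`need_cmd` then wraps this lookup with a `HashSet` so each command is checked (and potentially panicked over) at most once per run.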