Then how can you get into the situation that the shipped binary packages can't be built with the shipped toolchain or shipped dependencies - isn't that what the build daemons do, use the system to build itself? I mean, if they use the previous release or something, that's fine too. Whichever way they do it, surely it's documented internally, because it's the kind of thing developers would need to know when developing. I can't believe it would be a case of "if the build doesn't build, ssh into the build daemons and update them until it does".
I mean a developer working at Canonical, trying to make a change to one of the packages they're preparing for distribution, has to have some way to answer the question "what version of gcc is being used to build this package". Whether that information is kept in the source tree, on their wiki, or on a post-it on the employee fridge - in any case, it's part of the source in the preferred form for making modifications, because it's, well, part of what the employees use to make modifications.
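In Debian-style packaging that question does have a mechanical answer: a `.buildinfo` file records the exact versions of every package installed during the build, toolchain included. A minimal sketch of looking the compiler up from such a record (the record below is invented for illustration; real `.buildinfo` files are larger):

```python
# Sketch: extract the recorded compiler version from a Debian-style
# .buildinfo record. The record below is made up for illustration; real
# files list every build dependency under Installed-Build-Depends.
BUILDINFO = """\
Source: examplepkg
Version: 1.2-3
Installed-Build-Depends:
 gcc (= 4:5.3.1-1ubuntu1),
 libc6-dev (= 2.23-0ubuntu3),
 make (= 4.1-6)
"""

def compiler_version(buildinfo, tool="gcc"):
    """Return the recorded version of `tool`, or None if absent."""
    in_deps = False
    for line in buildinfo.splitlines():
        if line.startswith("Installed-Build-Depends:"):
            in_deps = True
            continue
        if in_deps:
            if not line.startswith(" "):  # continuation lines are indented
                break
            name, _, rest = line.strip().partition(" ")
            if name == tool:
                return rest.strip("(),= ")
    return None

print(compiler_version(BUILDINFO))  # → 4:5.3.1-1ubuntu1
```

So if the build infrastructure emits and archives these records, "what version of gcc built this" is answerable without the wiki or the post-it.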
Then how can you get into the situation that the shipped binary packages can't be built with the shipped toolchain or shipped dependencies
I think that Matthew might be suggesting that not all packages are rebuilt with every release? A binary package might be compiled with GCC-4, then the distribution is upgraded to GCC-5, dropping GCC-4. If for some reason the package is not compatible with the new compiler, you would then be in a situation where the binary cannot be recreated from source without outside tools. Do you know if Canonical recompiles everything on each update to the toolchain?
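The gap being described can be stated mechanically: each binary records which compiler built it, the archive ships some set of compilers, and any package whose original compiler was dropped and which fails with every compiler still shipped is no longer rebuildable from the archive alone. A toy sketch of that check, with all package names and compatibility data invented:

```python
# Toy model of the rebuild gap: which shipped binaries were built by a
# compiler the distribution no longer ships, and fail with the new one?
# All data here is invented for illustration.
built_with = {             # binary package -> compiler used at build time
    "libfoo1":   "gcc-4",
    "bar-utils": "gcc-5",
    "baz-core":  "gcc-4",
}
shipped_compilers = {"gcc-5"}        # gcc-4 was dropped in the upgrade
builds_ok_with = {                   # does the source still compile?
    ("libfoo1", "gcc-5"):  True,     # happens to work with the new gcc
    ("baz-core", "gcc-5"): False,    # relies on old-compiler behaviour
}

def unrebuildable(built_with, shipped, ok):
    """Packages whose original compiler is gone and that fail with
    every compiler still in the archive."""
    stuck = []
    for pkg, cc in built_with.items():
        if cc in shipped:
            continue                 # original toolchain still shipped
        if not any(ok.get((pkg, new), False) for new in shipped):
            stuck.append(pkg)
    return stuck

print(unrebuildable(built_with, shipped_compilers, builds_ok_with))
# → ['baz-core']
```

Rebuilding everything on each toolchain update (or keeping the old compiler packaged) would keep that list empty.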
I have no specific knowledge about Canonical's processes. I'm just amazed that a serious software company in 2016 wouldn't have a reproducible build process for their primary product.