I don't think this is correct. The camera applies a PSF (point spread function) to the image. If the PSF is known, for instance measured from a known target image, you can apply its inverse as a correction (deconvolution).
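A minimal sketch of that correction as a Wiener deconvolution, assuming the PSF has already been measured (the image, the Gaussian PSF, and the regularisation constant `k` below are all made-up illustrative values):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Recover an image blurred by a known PSF via Wiener deconvolution."""
    H = np.fft.fft2(psf, s=blurred.shape)   # transfer function of the PSF
    G = np.fft.fft2(blurred)
    # Regularised inverse filter: conj(H) / (|H|^2 + k) avoids dividing by ~0
    F = G * np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))

# Demo: blur a synthetic image with a known Gaussian PSF, then invert it.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
x = np.arange(-3, 4)
g = np.exp(-x ** 2 / 2.0)
psf = np.outer(g, g)
psf /= psf.sum()                             # normalise to unit energy
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) *
                               np.fft.fft2(psf, s=img.shape)))
restored = wiener_deconvolve(blurred, psf)
```

Note the limitation: at spatial frequencies where |H| is near zero, the regulariser dominates and those frequencies stay lost. Deconvolution sharpens what the optics attenuated, not what they destroyed.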
Field curvature (often conflated with spherical aberration) makes the focus deviate across the image, so there is only one zone, near the centre, where the image can truly be in focus. So although the image can be improved in the way you describe, you can't recover information that is genuinely out of focus.
There's a fair amount of literature claiming the opposite. Certainly, if you can take multiple Z-stack images you can correct it that way, but there are also methods that work directly on a single image to deblur it (i.e. bring it into focus).
I know next to nothing about how sensors are manufactured, but we do have the technology to build flexible displays. So what are the major challenges in building "flexible" (for lack of a better word, obviously they would be fixed in place) sensors?
A spherical lens produces a curved (roughly spherical) field of focus. The most direct way to correct for this is a sensor curved to match.
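A rough paraxial sketch of why a curved sensor helps (the field-curvature radius `R` and f-number `N` below are hypothetical illustrative values, not any real lens): a flat sensor placed at the axial focus sits roughly h²/(2R) away from the curved focal surface at image height h, and the geometric blur circle grows with that defocus.

```python
# Illustrative paraxial sketch (hypothetical numbers, not a real lens):
# a lens with field-curvature radius R focuses onto a curved surface, so a
# flat sensor at the axial focus sees a defocus of ~h^2 / (2R) at height h.

def focus_sag_mm(h_mm, R_mm):
    """Longitudinal gap between the curved focal surface and a flat sensor."""
    return h_mm ** 2 / (2.0 * R_mm)

def blur_diameter_mm(h_mm, R_mm, f_number):
    """Geometric blur-circle diameter from that defocus (defocus / N)."""
    return focus_sag_mm(h_mm, R_mm) / f_number

# Flat sensor: blur grows quadratically toward the corners.
R, N = 100.0, 2.0                    # hypothetical: R = 100 mm, f/2
for h in (0.0, 7.0, 14.0, 21.6):     # field heights out to a full-frame corner
    print(f"h = {h:5.1f} mm  blur = {blur_diameter_mm(h, R, N):.4f} mm")

# A sensor curved to radius R would match the focal surface, driving this
# defocus term to zero across the whole field.
```

This is why curving the sensor is attractive: it removes the field-dependent defocus optically instead of trying to deconvolve it away afterwards.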
Canon and Sony have indeed been working on curved sensor manufacturing, but it seems like a moonshot.