Again, the law isn’t enforced by robots and is able to adapt such that “clever legal hacks” don’t typically work. We programming nerds tend to think in terms of rigid, unambiguous rules that treat inputs as black boxes, but the law does not work like this.
If the AI could be shown to have copied the code, it would likely be found to be infringing.
If it was found to have generated new, unique code, having merely learned how to program from the code it was trained on, it likely wouldn't.
In either case, this is different from a clean-room implementation (which I think is what you meant by "white room").
Clean-room implementations are supposed to protect against trade secret infringement, and are mostly used when building interop with hardware (where compatibility has special carve-outs).
If a person or AI working on the project had seen the copyrighted code, it would never be considered clean room.
But CDDL code is fine for a person or AI to learn from when building a new, incompatible implementation that doesn't share any code.