Writing to the phone call stream IS possible, but not from the app level on a stock (non-rooted) phone.
When a phone call is initiated, the mic is "typically" (it really depends on the specific phone) routed directly to the baseband, i.e. skipping the main processor altogether.
For outgoing audio: mic -> codec -> baseband
For incoming audio: baseband -> codec -> speaker
If it were always routed mic -> codec -> main processor -> codec -> baseband,
then the stream would "presumably" be available, assuming the Android APIs (frameworks) supported accessing it.
The reason I say it is possible is that the audio (on nearly all smartphones now) is connected via SLIMbus, which allows dynamic re-routing of the audio paths. That routing, however, is done in the kernel by the codec driver living under ALSA.
So… were you so motivated, you could get the Linux kernel source for your phone and modify the codec/ALSA driver to change what happens when the call audio path is set up.
Of course, you would then incur latency on the new path, breaking the call-latency standards AT&T set up (which Audience helped them write…), and the baseband chip might reject your audio as not timely.
Lastly, you would need to modify the Android source (frameworks) to grow the API to support injecting audio onto that stream. (You'd need to make big mods to mediaserver, AudioFlinger in particular…)
It’s complicated, but there is your answer. Cheers 🙂