Moving 200 million records from a staging table into permanent tables in a single transaction is ambitious. It would be useful to have a scheme for dividing the records in table A into chunks which could be processed discretely.
Without seeing your code it's hard to tell, but I suspect you are attempting this RBAR (row-by-agonizing-row) rather than with a more efficient set-based approach. I think the key here is to de-couple the insertions from clearing down table A. Insert all the records, then zap A at your leisure. Something like this:
insert all
    when p = 'X' then into b
    when p = 'Y' then into c
    when p = 'Z' then into d
select * from a;

truncate table a;
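If a single INSERT ALL over 200 million rows blows out your undo or redo capacity, one way to chunk it is to drive the same multitable insert off ranges of a key. This is only a sketch: it assumes table A has a numeric key column `id`, and the batch size of one million is an arbitrary placeholder you would tune for your system.

declare
    -- Sketch only: walks A in batches of c_batch rows by key range,
    -- committing after each batch. Assumes a numeric key a.id.
    l_lo    a.id%type;
    l_hi    a.id%type;
    c_batch constant pls_integer := 1000000;  -- tune for your undo/redo
begin
    select min(id), max(id) into l_lo, l_hi from a;
    while l_lo <= l_hi loop
        insert all
            when p = 'X' then into b
            when p = 'Y' then into c
            when p = 'Z' then into d
        select *
        from   a
        where  id between l_lo and l_lo + c_batch - 1;
        commit;
        l_lo := l_lo + c_batch;
    end loop;
end;
/

Committing per batch trades the all-or-nothing guarantee of the single statement for much smaller transactions; only use it if you can tolerate (or restart from) a partially loaded state, and still truncate A only after all batches have completed.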