There is no issue here. All server databases make extensive use of caching and do not release memory unless required, e.g. when available free RAM falls below some threshold. The server will not run out of RAM, and trying to "free" the memory would actually degrade performance, because the server would have to read the data back from storage.
As for the hanging queries, the likely cause is that you issued 200K INSERT statements inside a single transaction, and you are probably experiencing locking or deadlock issues.
This is an artificial example that doesn't reflect the server's behavior in batch operations; batches are never 200K statements long. If you want to import 200K rows, use BULK INSERT on the server side or SqlBulkCopy on the client.
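As a sketch, a server-side bulk import of a CSV file might look like the following. The table name, file path, and delimiter options are hypothetical and would need to match your actual data:

```sql
-- Hypothetical example: load 200K rows from a CSV file in one
-- server-side bulk operation instead of 200K individual INSERTs.
BULK INSERT dbo.StagingOrders
FROM 'C:\imports\orders.csv'
WITH (
    FIELDTERMINATOR = ',',      -- column delimiter in the source file
    ROWTERMINATOR   = '\n',     -- row delimiter
    FIRSTROW        = 2,        -- skip the header row
    BATCHSIZE       = 50000,    -- commit in batches to limit lock and log growth
    TABLOCK                     -- table lock enables minimally logged inserts
);
```

Note that the file path is resolved on the database server, not the client machine, and importing into a staging table first keeps the load isolated from production tables.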
SQL Server is used in data warehouse applications where a typical import is millions of rows. The trick is to use the proper tools for the job, e.g. BULK INSERT, SSIS, and proper use of staging databases.