I've been tasked with generating a CSV file with the contents of a view in
an old database as an extra copy before the DB is decommissioned.
I've done this in the past for smaller tables or views without issue.
There are 80 million rows in the view, and the maximum row length is 5500
bytes (most rows are probably much shorter, since the columns on the base
tables are mostly varchar2). Doing the math, that's a maximum possible size
of 440 GB.
I saw a stored procedure on a website somewhere that read through a cursor
and dumped the records out to a text file via utl_file.put. Does anyone have
a better method, or is that my best option? I've had some TEMP tablespace
failures, so I may just need to allocate a bunch of space in TEMP, kick it
off one afternoon, and let it run overnight.
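For reference, the utl_file approach I have in mind would look roughly like
this. This is only a sketch: DUMP_DIR is a placeholder directory object,
MY_VIEW and the column names stand in for the real view, and the 32767
max_linesize is there because the default line limit (1023 bytes) is shorter
than my 5500-byte rows:

DECLARE
  fh  UTL_FILE.FILE_TYPE;
BEGIN
  -- 'w' = write mode; 32767 = max line size, needed for long rows
  fh := UTL_FILE.FOPEN('DUMP_DIR', 'my_view.csv', 'w', 32767);
  FOR rec IN (SELECT col1, col2, col3 FROM my_view) LOOP
    -- TRIM here would also cut the output size down
    UTL_FILE.PUT_LINE(fh,
      TRIM(rec.col1) || ',' || TRIM(rec.col2) || ',' || TRIM(rec.col3));
  END LOOP;
  UTL_FILE.FCLOSE(fh);
EXCEPTION
  WHEN OTHERS THEN
    -- don't leave the file handle dangling on an error partway through
    IF UTL_FILE.IS_OPEN(fh) THEN
      UTL_FILE.FCLOSE(fh);
    END IF;
    RAISE;
END;
/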
I could also shrink the overall size of the output file by running "trim" on
the columns.
Any ideas appreciated.
Environment: Oracle 126.96.36.199 on HP-UX 11.11