
I have a use case where a large amount of data (sometimes a million rows) is added to a temporary table (a session-based global temporary table), and the table is then joined to another table to produce results/graph/heatmap. The data comes from text files, or could come from a named pipe/stream/socket. SQL*Loader (SQLLDR) is not an option because the target is a temporary table.

Do you have suggestions as to how this can be done efficiently? Currently the SQL statement looks like this:

insert into tempipdata (ip)
select gutils.ip2long('100.0.59.165') from dual union
select gutils.ip2long('100.1.117.161') from dual union
select gutils.ip2long('100.23.117.161') from dual;
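One way to avoid the row-by-row UNION pattern above is to pass the whole list as a collection and unnest it in a single INSERT ... SELECT. A minimal sketch, assuming the `gutils.ip2long` function from the question and using the built-in `SYS.ODCIVARCHAR2LIST` collection type:

```sql
-- one statement, one parse, any number of rows;
-- UNION ALL semantics (no dedup pass) since each element is a row
insert into tempipdata (ip)
select gutils.ip2long(column_value)
from   table(sys.odcivarchar2list(
         '100.0.59.165',
         '100.1.117.161',
         '100.23.117.161'));
```

In practice the collection would be a single bind variable populated on the client side (e.g. via an array bind), so the statement is parsed once regardless of how many IP addresses are loaded.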

Any suggestions? I could get the IP address data from a socket or from the stdout of a program.

Thanks Vijay

  • Why don't you go for Oracle's nested tables? Creating, populating and deleting large temporary tables can fragment your tablespaces.
    – Rachcha
    Commented Apr 7, 2013 at 20:47
  • @Rachcha - inserting a million rows into a PL/SQL collection could place a huge strain on the PGA. Sometimes a temporary table is the right solution.
    – APC
    Commented Apr 7, 2013 at 20:50

1 Answer


For accessing data from an OS file, your best solution would be external tables. These are just like normal tables, except that the data comes from CSV (or similar) files, so you can run SQL against them. It's a better solution than temporary tables because you don't have to load the data first: you can select from it straight away. Find out more.
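An external table setup along these lines can be sketched as follows. This is a minimal example; the directory path, file name, and column layout are assumptions for illustration, not from the question:

```sql
-- directory object pointing at the server-side folder holding the file
create directory ip_data_dir as '/data/ip_feeds';

create table ext_ipdata (
  ip_address varchar2(15)
)
organization external (
  type oracle_loader
  default directory ip_data_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
  )
  location ('ips.csv')
);

-- query it directly in the join, no load step needed
select gutils.ip2long(ip_address) from ext_ipdata;
```

Note that the data file must reside on (or be accessible from) the database server, since the directory object is resolved there.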

  • Hmm.. External tables will not be a great idea. This is hosted as at least a 3-tier system of DB -> App -> Web service. The users who input/upload their data (IP addresses in this case) would have to do it on the same system as the database, which I generally don't consider scalable. The data is not needed or useful once the join query is complete and the graph/plot/heatmap is drawn, so it fits nicely with the criteria for a temporary, session-based table.
    – Vijay
    Commented Apr 7, 2013 at 22:40
  • I think I should just limit the temp table to about 100,000 rows, break up a user's large query across three workers, and use my app server to combine the results and produce the plot/graph/etc.
    – Vijay
    Commented Apr 7, 2013 at 22:41
