In terms of overhead, memory consumption, and ease of processing, which is the preferred approach for parsing a large XML file?
- simplexml_load_file
- simplexml_load_string
- Converting the SimpleXML object to an array
- Caching the SimpleXML result
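For what it's worth, the first two options produce the same object: `simplexml_load_file()` is essentially `simplexml_load_string()` applied to the file's contents, so neither saves memory over the other once the document is parsed. A minimal sketch (the file path and XML content here are illustrative):

```php
<?php
// Both calls yield an equivalent SimpleXMLElement; load_string just skips
// the file read that load_file performs internally.
$xml = '<products><product><name>Widget</name></product></products>';
file_put_contents('/tmp/products.xml', $xml);

$a = simplexml_load_file('/tmp/products.xml');
$b = simplexml_load_string($xml);

var_dump((string)$a->product->name === (string)$b->product->name); // bool(true)
```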
I am using SimpleXML to parse a very large XML document and return relevant search results requested by users.
$XMLproducts = simplexml_load_file("products.xml");
Besides producing the requested search results, the SimpleXML data is also used to build links that let the user further refine those results …
foreach ($XMLproducts->product as $Product) {
    if ($user_input_values == $applicable_xml_values) {
        // all refined search filter links produced here, then displayed later
        // ($url_code stands in for the link prefix; PHP concatenates with ".")
        $refined_search_filter_Array1[] = $url_code . (string)$Product->applicable_variable;
        $refined_search_filter_Array2[] = $url_code . (string)$Product->applicable_variable2;
    }
}
… as well as to build the search results pages (there will be 20 search results per page).
foreach ($XMLproducts->product as $Product) {
    // code to produce page-number links for the search results pages
}
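Since that loop only exists to count matches for the page-number links, the count and the per-page slice can come straight from a plain array, without re-walking the XML. A hypothetical sketch (the stand-in data and 20-per-page figure mirror the question):

```php
<?php
// Stand-in for the matched <product> nodes, flattened into an array.
$products = range(1, 47);
$perPage  = 20;
$pages    = (int)ceil(count($products) / $perPage); // 3 pages for 47 items

// Slice out just the requested page instead of re-parsing the whole XML.
$page  = 2;
$slice = array_slice($products, ($page - 1) * $perPage, $perPage);
echo $pages . ' pages, page 2 holds ' . count($slice) . " items\n";
```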
Then we ultimately get to the actual search results requested by the user:
foreach ($XMLproducts->product as $Product) {
    if ($user_input_values == $applicable_xml_values) {
        echo $Product->name; // …
    }
}
Since the user can click any number of refined search filter links and page-number links, would it be better to convert the initial SimpleXML result into an array, or cache it, for as long as the user is working with the search results? That way, each click on a filter link or page-number link would read from the array or cache instead of re-loading the entire XML file with another SimpleXML call.
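One way to sketch the caching idea: note that a SimpleXMLElement itself cannot be passed to `serialize()` (PHP refuses to serialize it), which is one reason to flatten the matched products into a plain array first and cache that. A minimal illustration, assuming a file-based cache; the paths and field names here are hypothetical:

```php
<?php
// Build a tiny stand-in products.xml so the sketch is self-contained.
$xmlFile   = '/tmp/products.xml';
$cacheFile = '/tmp/search_results.cache';
file_put_contents($xmlFile, '<products><product><name>Widget</name></product>'
                          . '<product><name>Gadget</name></product></products>');
@unlink($cacheFile); // start fresh for the example

if (is_file($cacheFile)) {
    // Filter-link / page-link requests hit this branch: no XML parsing at all.
    $results = unserialize(file_get_contents($cacheFile));
} else {
    // First request: parse once, keep only the scalar fields the pages need.
    $results = [];
    foreach (simplexml_load_file($xmlFile)->product as $Product) {
        $results[] = ['name' => (string)$Product->name];
    }
    file_put_contents($cacheFile, serialize($results));
}
// $results is now a small plain array usable for filtering and pagination.
echo count($results) . "\n";
```

The trade-off is freshness: the cache file has to be invalidated (deleted or rewritten) whenever products.xml changes.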
Thanks for any advice.