I am trying to improve a filtering algorithm. I am developing backup software that lets users specify custom filters to exclude unwanted files and directories. Currently, when checking whether a file should be included in a backup, its path is compared against a list of rules.
Each rule applies to a single path (file or directory), and the file path is compared in a way similar to this.
Let's assume this is our rule structure:
[Rule]
Path = "C:\Windows\"
Exclude = TRUE
I compare the file path I got against the rule path as a prefix match, using the length of the rule path.
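To make the current approach concrete, here is a minimal sketch of that prefix comparison in Python. The names (`Rule`, `should_exclude`) are illustrative, not from my real code, and I've ignored details like case-insensitive Windows paths:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    path: str      # e.g. "C:\\Windows\\"
    exclude: bool  # True if matching files should be skipped

def should_exclude(file_path: str, rules: list[Rule]) -> bool:
    """Linear scan: each rule is checked by comparing the first
    len(rule.path) characters of the file path against the rule path."""
    for rule in rules:
        if file_path[:len(rule.path)] == rule.path:
            return rule.exclude
    return False  # no rule matched: include the file

rules = [Rule(path="C:\\Windows\\", exclude=True)]
print(should_exclude("C:\\Windows\\system32\\kernel32.dll", rules))  # True
print(should_exclude("C:\\Users\\me\\file.txt", rules))              # False
```

Every file is checked against every rule until one matches, which is where the scaling concern below comes from.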
This works fine and I haven't had any issues with it, but there is no limit on the number of rules a user can add to the system. They can add as many rules as they like, and that is my problem: every additional rule means one more comparison per file, so the cost of filtering grows linearly with the size of the rule set.
What is a good practice for handling this?