I have a text file that contains webpage links in the following format:
linkname::linkurl
linkname::linkurl
...etc
I am using fgets to grab each line, splitting it and building a hypertext link with <a href="$linkurl">$linkname</a>. Simple stuff.
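For reference, the read/split/render part might look something like this (the filename links.txt and the helper name lineToAnchor are my own placeholders, not anything from the original setup):

```php
<?php
// Turn one "linkname::linkurl" line into an anchor tag.
function lineToAnchor($line) {
    // Limit to 2 parts so a URL containing "::" stays intact.
    list($name, $url) = explode('::', trim($line), 2);
    return '<a href="' . htmlspecialchars($url) . '">' . htmlspecialchars($name) . '</a>';
}

// Typical fgets loop over the file (path is assumed).
if (($fh = @fopen('links.txt', 'r')) !== false) {
    while (($line = fgets($fh)) !== false) {
        if (trim($line) !== '') {        // skip blank lines
            echo lineToAnchor($line), "\n";
        }
    }
    fclose($fh);
}
```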
Now, it gets tougher here. I am making a web interface that lets me edit the list of links in the file. I made a script to add new links easily enough, but when it comes to removing a line from the file (effectively removing the link from the site) I seem at a loss. I know I could spend a lot of time reading each line into variables, removing the unwanted line, and then writing them all back to the file, but is there a quicker way? I thought maybe I could search through the file, find the line I don't want, and delete it right there, but I can't find anything to do that with. fseek only seems to move by byte offset, and I need something more Perl-like.
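In case it helps frame the question, here is a minimal sketch of the read/filter/write approach I was hoping to avoid (the function name removeLink and the matching-by-name behaviour are my own assumptions):

```php
<?php
// Drop the entry whose linkname matches $nameToRemove, then rewrite the file.
// A file is just a byte stream, so there is no true in-place line delete:
// everything after the removed line has to be written out again regardless.
function removeLink($path, $nameToRemove) {
    $kept = array();
    foreach (file($path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        $parts = explode('::', $line, 2);   // [linkname, linkurl]
        if ($parts[0] !== $nameToRemove) {
            $kept[] = $line;                // keep every non-matching line
        }
    }
    file_put_contents($path, count($kept) ? implode("\n", $kept) . "\n" : '');
}
```

Even this is only a handful of lines thanks to file() and file_put_contents, so maybe that already counts as the "quicker way".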
Any advice is greatly appreciated. I just want to avoid writing many lines of code for something that seems like it should be pretty simple to accomplish.
-fudge