This is for when the extension encounters something that makes it unable
to communicate with the server. It will stay disabled until the user
attempts to shorten a link again.
Fairly straightforward. Commit 298b2606ea also resolves #1 (http://lob.li/mk6)
The messages are finally where they belong, and link resolution for IDs and long URLs has been added. I might want to work on lob.li/linkid in the future too.
Whoops. http://lob.li/8pg
While functional in theory, header('Location: index.php') just loads the index again inside the message field, with non-functioning JavaScript.
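Since the response ends up in the message field (so presumably the form is submitted via JavaScript), a sketch of the alternative: return only the bit of output the page needs instead of a redirect. $shortUrl is a hypothetical variable here, not necessarily what the code calls it.

    <?php
    // Return just the new short link as the response body; the page's
    // JavaScript drops it into the message field. No redirect involved.
    echo $shortUrl;
    exit;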
This token generation should prevent the same token from being used twice. I'm also considering adding some kind of cooldown timer.
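A minimal sketch of collision-proof token generation, assuming PHP 7+ and a hypothetical tokenExists() lookup against the database:

    <?php
    // Keep drawing random tokens until we find one that is unused.
    do {
        $token = bin2hex(random_bytes(16)); // 32 hex characters
    } while (tokenExists($token));          // hypothetical DB check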
Short links were being returned as (shortid)$, which didn't work.
This thing is just a big pain and isn't really needed, so I'm removing it.
Realistically, every option for getting the title (or even the description) involved downloading the whole file.
A pretty hacky way of doing this: it fetches the needed data from the table and adds it to the Redis DB.
Run it from the command line: $ php redis.php
Set your password as needed.
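Roughly what redis.php does, as a sketch (the table/column names links, linkID, link and the credentials are assumptions):

    <?php
    // redis.php - one-shot CLI script: copy all links from MySQL into Redis.
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    $redis->auth('your-password-here'); // set your password as needed

    $db = new PDO('mysql:host=localhost;dbname=shortener', 'user', 'pass');
    foreach ($db->query('SELECT linkID, link FROM links') as $row) {
        $redis->set($row['linkID'], $row['link']);
    }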
Using a sorted set makes it easier to sort from highest to lowest on the stats page.
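Sketched with an assumed key name of hits: each hit bumps the member's score, and the stats page reads the set back in descending order.

    <?php
    // Count a hit for this short link ($shortID: the link's ID).
    $redis->zIncrBy('hits', 1, $shortID);

    // Stats page: top 10 members, highest score first, scores included.
    $top = $redis->zRevRange('hits', 0, 9, true);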
Minor change to how tracking is done on the index; link checking has also been updated.
expire() and setex() store the TTL as a countdown from now, so the deadline doesn't survive a server reboot and repopulation.
This is fixed by setting the keys to expire at an absolute time in the future.
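A sketch of the difference, assuming the creation time is stored in MySQL and the 24-hour default from the next commit:

    <?php
    // Relative TTL: the countdown restarts every time the key is rewritten.
    // $redis->setex($shortID, 86400, $url);

    // Absolute expiry: the deadline is fixed, so repopulating from MySQL
    // after a reboot keeps the original expiration time.
    $redis->set($shortID, $url);
    $redis->expireAt($shortID, $createdAt + 86400); // $createdAt: Unix creation time from MySQL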
The vast majority of short links are only used once and forgotten about.
This gives people the option of links that expire (default: 24 hours after creation).
This is also the first time that we use either column other than the center one.
My naming convention put capitals on everything, but I was referencing them without those capital letters,
which breaks on pretty much any OS other than Windows...
Since I will be using Redis from multiple files, the connection to the server should be in the same file as the other DB connection.
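A sketch of such a shared connection file (the file name and credentials are assumptions):

    <?php
    // db.php - both data-store connections in one place.
    $db = new PDO('mysql:host=localhost;dbname=shortener', 'user', 'pass');

    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    $redis->auth('your-password-here');

Every other file can then just require_once it.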
This icon looks a little bit better I think.
This is purely aesthetic. I may implement a robots.txt parser for this in the future, if it doesn't spend
too much time parsing just to follow robots.txt rules. This also isn't technically a crawler,
as it only checks whether the remote server is online.
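A minimal sketch of that kind of online check, using a cURL HEAD request (the timeout is an arbitrary choice):

    <?php
    // True if the remote server answers at all; no body is downloaded,
    // which is why this isn't really a crawler.
    function isOnline(string $url): bool {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request only
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo output
        curl_setopt($ch, CURLOPT_TIMEOUT, 5);
        $ok = curl_exec($ch) !== false;
        curl_close($ch);
        return $ok;
    }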
I was wondering why my script had suddenly stopped working and thought it was
a Redis problem.
Turns out I was getting PHP parse errors and hadn't checked the logs.
This is very basic: I check the Redis table to see if a short link exists.
If not, I query the MySQL server and pull that result into Redis as linkID and link.
I might make some modifications to this
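The lookup, roughly (the query shape is an assumption; linkID and link match the names above):

    <?php
    // Check Redis first; on a miss, fall back to MySQL and cache the result.
    $url = $redis->get($shortID);
    if ($url === false) {
        $stmt = $db->prepare('SELECT link FROM links WHERE linkID = ?');
        $stmt->execute([$shortID]);
        $url = $stmt->fetchColumn();
        if ($url !== false) {
            $redis->set($shortID, $url);
        }
    }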
Trying to make it look a bit less cluttered and still have high contrast.
This works rather well, I think.
And fix the revert mistake I made the last time I tried to commit
this... >.>