Web Applications That Deliver Real Business Solutions
RFC Express provides access to public lawsuit filings retrieved from the U.S. Federal District Courts and from justia.com.
RFC Express came to Plego needing a solution that would crawl public databases for lawsuit documents and case records and publish those cases to its website.
This solution would consolidate data from hundreds of courts and databases around the country into one dynamic lawsuit document repository. The documents would then be offered for purchase, and cases could be followed for updates.
Plego built a custom C#/.NET and SQL Server solution with multiple crawlers that gathered relevant information from the federal courts and populated the website with lawsuit information and documents. These documents were then offered for purchase, and cases could be tracked for updates for a fee.
Cases are pulled from both sources (Justia and the public court databases) by crawlers that run as scheduled cron jobs. The crawlers retrieve and save data into a MySQL staging database; the required fields are then transferred to a Microsoft SQL Server database.
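The staging-then-transfer step above can be sketched as follows. This is a minimal illustration, not the production code: the table and column names are hypothetical, and SQLite stands in for both the MySQL staging database and the SQL Server destination so the example is self-contained.

```python
import sqlite3

def save_crawled_cases(staging, rows):
    # Crawlers dump everything they find (including raw page content)
    # into the staging store.
    staging.executemany(
        "INSERT INTO staging_cases (case_no, court, title, raw_html) "
        "VALUES (?, ?, ?, ?)",
        rows,
    )

def transfer_required_fields(staging, main):
    # Only the fields the website needs are copied to the main store.
    for case_no, court, title in staging.execute(
        "SELECT case_no, court, title FROM staging_cases"
    ):
        main.execute(
            "INSERT INTO cases (case_no, court, title) VALUES (?, ?, ?)",
            (case_no, court, title),
        )

staging = sqlite3.connect(":memory:")
staging.execute(
    "CREATE TABLE staging_cases (case_no TEXT, court TEXT, title TEXT, raw_html TEXT)"
)
main = sqlite3.connect(":memory:")
main.execute("CREATE TABLE cases (case_no TEXT, court TEXT, title TEXT)")

save_crawled_cases(
    staging, [("1:23-cv-00001", "N.D. Ill.", "Doe v. Roe", "<html>...</html>")]
)
transfer_required_fields(staging, main)
print(main.execute("SELECT COUNT(*) FROM cases").fetchone()[0])
```

In production this transfer would run as its own scheduled job after the crawlers complete, so a failed crawl never leaves the public-facing database half-populated.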
Plego then created a custom shopping cart and checkout module for users to purchase digital documents and track lawsuits. Documents were retrieved on demand: a user could open any case and request its documents, and an on-demand crawler would re-access the court's database to crawl the latest documents for that case. Since documents are constantly added and updated, this crawl was performed on demand, for a fee, whenever a user triggered a request.
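The on-demand pattern can be sketched like this. Everything here is hypothetical: `fetch_docket` stands in for the real court-database crawler, and the paid-request check is simplified to a single flag.

```python
def fetch_docket(case_id):
    # Stand-in for re-crawling the court's database for this case;
    # the real crawler would fetch the live docket over the network.
    return [{"case": case_id, "doc": n} for n in (1, 2, 3)]

class DocumentStore:
    def __init__(self):
        self.cache = {}

    def request_documents(self, case_id, user_paid):
        if not user_paid:
            raise PermissionError("document retrieval is a paid, per-request service")
        # Always re-crawl on request, since new filings appear constantly;
        # nothing is assumed fresh just because it was fetched before.
        self.cache[case_id] = fetch_docket(case_id)
        return self.cache[case_id]

store = DocumentStore()
docs = store.request_documents("1:23-cv-00001", user_paid=True)
print(len(docs))
```

The key design choice is that the crawl is triggered by the paying user rather than by a schedule, so the site never pays to crawl documents nobody has asked for.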
A subscription-based model was also introduced, allowing power users constant access to the site for a monthly fee.
The “Lawsuit Tracker” was created so that users could receive text and email updates the moment a change was made to a particular lawsuit they were “following”. If any documents were added to the suit, a process would automatically email and text users a link to the updated information.
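The tracker behaves like a simple publish/subscribe mechanism, which can be sketched as below. This is an illustrative sketch only: `send_alert` merely records messages here (in production it would call the email/SMS gateway), and the link URL is a made-up example.

```python
sent = []

def send_alert(contact, link):
    # Stand-in for the real email/SMS gateway call.
    sent.append((contact, link))

class TrackedLawsuit:
    def __init__(self, case_id):
        self.case_id = case_id
        self.followers = []  # email addresses and/or phone numbers

    def follow(self, contact):
        self.followers.append(contact)

    def add_document(self, doc_id):
        # The moment a document is added, every follower gets a link
        # to the updated case.
        link = f"https://example.com/cases/{self.case_id}/docs/{doc_id}"
        for contact in self.followers:
            send_alert(contact, link)

suit = TrackedLawsuit("1:23-cv-00001")
suit.follow("user@example.com")
suit.follow("+1-555-0100")
suit.add_document("42")
print(len(sent))
```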
In addition, a full administrative utility was created to manage the millions of cases on the site. The admin section also serves as a monitoring tool, showing which crawlers executed successfully and which failed to retrieve the required data. Error codes are displayed, and the admin is alerted to any failed executions by email and/or text.
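The monitoring side of the admin tool can be sketched as a run log plus a failure hook. Again a hypothetical sketch: `alert_admin` records alerts in a list rather than sending real email or text messages, and the crawler names and error code are invented for illustration.

```python
alerts = []

def alert_admin(message):
    # Stand-in for the real email/text notification to the admin.
    alerts.append(message)

class CrawlerMonitor:
    def __init__(self):
        self.runs = []

    def record_run(self, crawler, ok, error_code=None):
        # Every crawler execution is logged; failures carry an error code
        # and immediately alert the admin.
        self.runs.append({"crawler": crawler, "ok": ok, "error_code": error_code})
        if not ok:
            alert_admin(f"{crawler} failed with error {error_code}")

    def failed_runs(self):
        # What the admin dashboard surfaces for review.
        return [r for r in self.runs if not r["ok"]]

monitor = CrawlerMonitor()
monitor.record_run("justia-crawler", ok=True)
monitor.record_run("courts-crawler", ok=False, error_code=503)
print(len(monitor.failed_runs()), len(alerts))
```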