Spread The Love


Introduction

There are a number of challenge caches around that require the finder to have found caches which had gone unfound for more than a certain period before their visit. Examples are:

GC3KWWQ - Spread the Love Challenge [1]

This cache requires the finder to find enough caches that hadn't been found for at least 183 days (at the time of each find) for the unfound days to add up to more than 15 years (5479 days). For example, 30 finds that each ended a 183-day dry spell would contribute 30 x 183 = 5490 days.

GC47J4N - Challenge: Fifteen Forgotten Years [2]

This is essentially the same as the Spread the Love Challenge.

GC45RN3 - Challenge: Five Forgotten Years [3]

As above, but the total is only 5 years (1826 days).

Note: This has been ported to a GSAK Macro: http://gsak.net/board/index.php?showtopic=28564

Finding your finds on unloved caches

Unfortunately it can be quite hard to tease these out unless you already know of a few, especially if you are making the total up from lots of 6-12 month gaps. So to make it easier I've written some SQL and PowerShell code to generate the statistics for you.

What you need

  • A GSAK database with all your finds in it. Note: this DB also needs the previous finder's 'Found it' log for each cache, or the 'Publish Listing' log if your find was an FTF
  • You need a Windows Vista or Windows 7 machine (or have PowerShell installed on Windows XP)
  • A copy of the SQLite command shell for Windows [4]
  • OPTIONAL: A spreadsheet program like Excel to view the data
  • The code below

What to do

Extract the data

The first step is to extract all the logs on all your found caches from GSAK. You need to know the following:

  • The path to your GSAK database. This can be found by using the folder finder in GSAK. Tool --> Folder Finder. For example, my database is located here:
C:\Users\Jeremy\AppData\Roaming\gsak\data\Default\sqlite.db3
  • The path to the sqlite3.exe file you downloaded (a short PowerShell sketch for checking both of these paths follows this list)
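
If you want to check both paths from PowerShell before going any further, here is a minimal sketch. The sqlite3.exe location and the database name "Default" are just the examples used on this page; substitute whatever Folder Finder and your download actually gave you:

# Example locations only -- substitute your own paths.
$sqlite = 'D:\Downloads\SQLite\sqlite3.exe'
$db     = Join-Path $env:APPDATA 'gsak\data\Default\sqlite.db3'   # same layout as the example path above
# Both of these should print True before you carry on.
Test-Path $sqlite
Test-Path $db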

Now do the following (or use the scripted alternative sketched after these steps):

  • Open up a command prompt.
  • Open up the GSAK DB in the SQLite command shell (make sure GSAK is not running at this time). You do this by typing the path to the sqlite3.exe file followed by the path to the DB, separated by a space. For me it looked like this:
    "D:\Downloads\SQLite\sqlite3.exe" "C:\Users\Jeremy\AppData\Roaming\gsak\data\Default\sqlite.db3"
  • Type or paste the following into the sqlite> command prompt to redirect the output to a file.
    .output FoundItLogsOnFoundCaches.txt
  • Type or paste the following to extract the logs to the file.
    SELECT Logs.lParent,Caches.Name,Caches.CacheType,Logs.lBy,Logs.lDate,Logs.lType FROM Caches INNER JOIN Logs ON Caches.Code = Logs.lParent WHERE Caches.Found = 1 AND (Logs.lType = 'Found it' OR Logs.lType = 'Publish Listing') ORDER BY Logs.lParent ASC,Logs.lDate DESC;
  • Type the following to exit out of sqlite.
    .quit
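
If you would rather not type into the interactive sqlite> prompt, the same three steps can be run from PowerShell in one go. This is a minimal sketch using the same example paths as above (change $sqlite and $db to match your machine); the dot-commands and the query are piped into sqlite3, which writes the output file itself:

# Example paths -- change these to match your own setup, and make sure GSAK is not running.
$sqlite = 'D:\Downloads\SQLite\sqlite3.exe'
$db     = 'C:\Users\Jeremy\AppData\Roaming\gsak\data\Default\sqlite.db3'

# The same .output / SELECT / .quit sequence as above, held in a here-string.
$commands = @"
.output FoundItLogsOnFoundCaches.txt
SELECT Logs.lParent,Caches.Name,Caches.CacheType,Logs.lBy,Logs.lDate,Logs.lType FROM Caches INNER JOIN Logs ON Caches.Code = Logs.lParent WHERE Caches.Found = 1 AND (Logs.lType = 'Found it' OR Logs.lType = 'Publish Listing') ORDER BY Logs.lParent ASC,Logs.lDate DESC;
.quit
"@

# Pipe the commands into sqlite3; it writes the "|"-separated rows to the file itself.
$commands | & $sqlite $db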

Process the data

Now you should have all the data in a text file (which may be quite large) called "FoundItLogsOnFoundCaches.txt" in the current directory. To process the data:

  • Open PowerShell. Just type:
    powershell
  • Copy the following text into notepad and replace my cacher name at the top with your own.
# Replace "noikmeister" with your own geocaching.com user name.
$YourCacherName = "noikmeister"

# Column names for the "|"-separated file produced by the SQL query above.
$Header = "Code","CacheName","Type","Name","Date","LogType"
$Logs = Import-Csv -Delimiter "|" -Header $Header -Path .\FoundItLogsOnFoundCaches.txt

# Write the header row of the output file, then walk the logs (newest first for each cache).
"Code`tCacheName`tNextDate`tYour Date`tDays Between`tURL`tType" | Out-File -FilePath .\QualifyingCaches.tsv
$Published = $False      # True once a Publish Listing log has been seen for the current cache
$CompareNext = $False    # True when the previous log processed was one of your finds
$QualCodes = @()         # Collects the codes of qualifying caches
$Logs | ForEach {
	If ($_.LogType -eq "Publish Listing") {$Published = $True}
	If ($_.Name -eq $YourCacherName) {
		# This is one of your finds; remember it and compare it with the next (older) log.
		$CompareNext=$True
		$MyLog = $_
	} Else {
		If ($CompareNext) {
			If ($Published) {$CompareNext = $False}
			If ($MyLog.Code -eq $_.Code) {
				# Same cache: ignore logs on the same day as yours (group finds),
				# otherwise work out how long the cache had gone unfound.
				If ($MyLog.Date -ne $_.Date) {
					$DateDiff = (Get-Date($MyLog.Date)) - (Get-Date($_.Date))
					If ($DateDiff.Days -gt 182) {
						# More than 182 days between the previous log and your find: it qualifies.
						"Qualifying cache found: " + $MyLog.Code + "`t" + $MyLog.CacheName
						$QualCodes += "'" + $MyLog.Code + "'"
						$MyLog.Code + "`t" + $MyLog.CacheName + "`t" + $_.Date + "`t" + $MyLog.Date + "`t" + $DateDiff.Days + "`thttp://coord.info/" + $MyLog.Code + "`t" + $_.Type | Out-File -FilePath .\QualifyingCaches.tsv -Append
					}
					$CompareNext = $False
					If (Test-Path Variable:DateDiff) {Clear-Item Variable:DateDiff}
					If (Test-Path Variable:MyLog) {Clear-Item Variable:MyLog}
				}
			} Else {
				# Different cache: reset the state before carrying on.
				$Published = $False
				$CompareNext = $False
				If (Test-Path Variable:DateDiff) {Clear-Item Variable:DateDiff}
				If (Test-Path Variable:MyLog) {Clear-Item Variable:MyLog}
			}
		}
	}
	# A new cache name means any Publish Listing seen belonged to the previous cache.
	If ($LastLog.CacheName -ne $_.CacheName) {$Published = $False}
	$LastLog = $_
}
  • Now paste the code into the PowerShell window (right-click pastes). If you end up with a ">>" prompt at the end, press ENTER a couple of times to get the process going.
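
As an alternative to pasting, you can save the script as a .ps1 file and run it from the command prompt. This is a minimal sketch; SpreadTheLove.ps1 is just a made-up name for wherever you saved the code, and -ExecutionPolicy Bypass relaxes the script execution policy for this one PowerShell process only:

powershell -ExecutionPolicy Bypass -File .\SpreadTheLove.ps1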


What does it do?

  • The query pulls out the 'Found it' and 'Publish Listing' logs on every cache with the status of Found in GSAK. It also orders the output by cache code and then log date in descending order (i.e. newest logs first). So it is crucial that the GSAK DB has all of your found caches and knows they are found (i.e. yellow). It also relies on the log immediately before your find (or the publish log) being in GSAK.
  • The PowerShell code processes the logs and finds those of your finds where the previous log on the cache is more than 182 days older, then outputs them to a tab-separated values file called QualifyingCaches.tsv in the current directory. It also writes some output to the screen so you can tell it is working. You can open the tsv file in Excel and sort or format it, or total it up in PowerShell as sketched below.
  • Here is a rough approximation of how it works in English:
    • If the log is mine, then store it and read the next log.
    • If the next log is for the same cache and isn't on the same day, then compare the dates and output the results if the time span is greater than 182 days. The logic behind ignoring logs on the same day is that you may have found the cache with other people, so those logs shouldn't count as a previous find.
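
If you want to check your total against the target without opening Excel, you can read the output file straight back into PowerShell. A minimal sketch, assuming QualifyingCaches.tsv is still in the current directory (5479 days is the 15-year target; use 1826 for the 5-year challenge):

# Read the tab-separated output back in and total the "Days Between" column.
$Qualifying = Import-Csv -Delimiter "`t" -Path .\QualifyingCaches.tsv
$TotalDays = ($Qualifying | ForEach-Object { [int]$_.'Days Between' } | Measure-Object -Sum).Sum
"" + @($Qualifying).Count + " qualifying finds totalling " + $TotalDays + " days (need more than 5479 for 15 years)"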