Advanced Info & Tips

This article contains an array of best practices, advanced admin info, and tips that we've learned over the years.  We'll add more tips to this article as we learn them!

This article is intended for current Angelfish Admins who are already familiar with admin tasks and terms.

USE THE COPY BUTTON
  • You can make a copy of each Object (Profile, Datasource, etc.) via the Copy button
  • Investigate issues in the copy, then apply changes to the original
  • Use a Tag or add a string to the Object name so the copies are easy to find and delete later
  • e.g. add a Tag of "copy" or prepend "TMP-" to the name


REPROCESSING DATA
  • Starting with v2.44, each Profile's Run / Data Management tab contains a blue "Reprocessing Utility" link; click it to open the utility
  • This utility lets you delete and reprocess data for a specified date range in one step
  • More info here:
Help Article: Reprocessing Utility


THE "UPDATE & REPROCESS" METHOD
  • This is the best way to identify and eliminate bad data from the reports in a new Profile
  • The first time you process data, only process a few small logs or one large log.
  • Open the reports and look for unwanted or missing data.
  • Reports to use: Browsers, Platforms, Usernames, Organizations, IP Addresses, any of the Content reports, and Top Files (in IT Reports)
  • When you identify something to exclude, update the relevant config settings or Filters
  • Use the Reprocessing Utility to delete all data in the Profile and process the same logs again
  • Once the reports look right, use the Reprocessing Utility to process a larger group of logs
  • Go through as many iterations of Update & Reprocess as you need to clean up your reports
  • If you notice undesirable data in the future, make a copy of the Profile
  • Use the copy to test changes before applying them to the active Profile
  • Use the Delete Data: Specific Visits utility to remove undesirable data from an active Profile (see Deleting Data, below)


DATA FILES
  • Report data files are stored in a folder for each Profile: ../data/<profile-id>/
  • agfs-vs.db contains Visit-level report data
  • agfs-it.db contains Hit-level report data (shown in the IT Reports section)
  • lt.db contains the Profile's log tracking info
  • These files are not internally tied to the Profile ID, which means you can move them into another Profile's data directory
  • If you process data in a copy of a Profile, you can replace the report data files in the original Profile with the data files from the copy (see the sketch below)
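
If you want to promote report data from a Profile copy into the original, the files can be copied at the filesystem level. The sketch below is a hypothetical illustration, not a built-in Angelfish tool: the profile ID folder names are assumptions, and it's prudent to make sure neither Profile is processing data while the files are swapped.

  # Hypothetical sketch: copy report data files from a Profile copy into the
  # original Profile's data folder. The profile ID folder names are assumptions;
  # adjust DATA_ROOT to point at your Angelfish data directory.
  import shutil
  from pathlib import Path

  DATA_ROOT = Path("../data")              # data folder referenced in this article
  SOURCE_PROFILE = "copy-profile-id"       # assumed ID of the Profile copy
  TARGET_PROFILE = "original-profile-id"   # assumed ID of the original Profile

  # Report data files named in this article (lt.db holds log tracking info
  # and is left alone here)
  for name in ("agfs-vs.db", "agfs-it.db"):
      src = DATA_ROOT / SOURCE_PROFILE / name
      dst = DATA_ROOT / TARGET_PROFILE / name
      if src.exists():
          shutil.copy2(src, dst)           # overwrite the original Profile's file
          print(f"copied {src} -> {dst}")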


DELETING DATA
  • The Delete Data section in the Run / Data Management tab in each Profile provides three ways to remove data:
    • All Data: removes all report data AND all log tracking info from the Profile
    • Custom Range: removes report data between a specified start and end date
    • Specific Visits: removes report data that matches a Dimension pattern between a specified start and end date 
  • Use the Specific Visits option to satisfy data deletion requests

Help Article: Deleting Data


USE TWO PROFILES FOR ONE WEBSITE
  • Each Tracking Method provides a unique perspective in the reports
  • Create the second Profile by making a copy of the original and updating the Tracking Method
  • If the original Profile uses a JS-based Tracking Method, use a log-based method in the copy
  • The Log-based Profile will contain traffic from Visits that block JavaScript


COMBINE DATA FROM TRACKING METHODS
  • Each Profile uses one Tracking Method, and you can update the Tracking Method at any time
  • You can process historical logs using one Tracking Method, then switch to another for future logs
  • Some fields only exist in JS-based Tracking Methods, like Page Title


LOG FILE MANAGEMENT
  • It's a good idea to save your web logs: you can always reprocess data if you have the logs
  • Logs compress very well (file sizes often shrink by 90% or more), and Angelfish reads compressed logs; a compression sketch follows this list
  • Supported formats: zip, gzip, bzip2, 7zip
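
As a rough illustration of archiving, the sketch below gzip-compresses any uncompressed .log files in a folder. The folder path is an assumption; point it at your own log archive. Angelfish can read the resulting .gz files directly.

  # Hypothetical sketch: gzip-compress raw .log files for long-term storage.
  # The archive folder path is an assumption.
  import gzip
  import shutil
  from pathlib import Path

  LOG_DIR = Path(r"D:\weblogs\archive")    # assumed log archive folder

  for log in LOG_DIR.glob("*.log"):
      gz_path = log.with_name(log.name + ".gz")   # e.g. access.log -> access.log.gz
      with open(log, "rb") as src, gzip.open(gz_path, "wb") as dst:
          shutil.copyfileobj(src, dst)     # stream-compress the file
      log.unlink()                         # remove the original once compressed
      print(f"compressed {log.name} -> {gz_path.name}")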


DATE SUBSTITUTION IN PATHS & FILE NAMES
  • Date substitution statements can be used in both file names and paths
  • UNC Example:
    • Actual Path: \\10.1.1.35\logs\2022\10\31\u_ex221031.log
    • Path w/ Date Substitution: \\10.1.1.35\logs\YYYY\MM\DD\u_exYYMMDD.*
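
To make the token expansion concrete, here is a small sketch that resolves the tokens for a given date. It only illustrates the idea; it is not Angelfish's actual implementation, and the template is the UNC example from above.

  # Hypothetical sketch: expand date substitution tokens (YYYY, YY, MM, DD)
  # in a path template, the way the example above resolves for 2022-10-31.
  from datetime import date

  def expand(template: str, d: date) -> str:
      # Replace the longer token first so "YYYY" isn't partially consumed by "YY"
      return (template.replace("YYYY", f"{d.year:04d}")
                      .replace("YY", f"{d.year % 100:02d}")
                      .replace("MM", f"{d.month:02d}")
                      .replace("DD", f"{d.day:02d}"))

  template = r"\\10.1.1.35\logs\YYYY\MM\DD\u_exYYMMDD.*"
  print(expand(template, date(2022, 10, 31)))
  # -> \\10.1.1.35\logs\2022\10\31\u_ex221031.*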