Tracking SharePoint Online with Multiple Content Owners

In large SharePoint Online (SPO) environments, it can be difficult to isolate each content owner's web analytics data.

This article explains how Angelfish can isolate web analytics data for each SPO content owner.

This solution works for Shared Services Teams, Departments, and any environment with multiple content owners.


PREREQUISITES

This solution combines several Angelfish administrator features and assumes you are familiar with creating Profiles, Datasources, and Filters.

Before proceeding, please read the articles covering those features, along with the Microsoft API article for tracking SharePoint Online.



DEFINITIONS


Data Transfer Collection
A Collection whose main function is to download data from SPO and create access logs for the Department Collections.  A Profile and Datasource are created in the Data Transfer Collection for each Department Collection.

Only Angelfish Global Admins should have access to the Data Transfer Collection.

Department Collection
A Collection that contains the Profiles, Datasources, and Filters for a department or content owner.  Each department or content owner should have its own Department Collection.

Each Department Collection has a Datasource that points at logs created by the Data Transfer Collection.  Collection Managers are used to manage each Department Collection.


SOLUTION OVERVIEW

  • This solution uses the Microsoft API method for tracking SharePoint Online
  • Collections are used to isolate the "data transfer" and "data processing" tasks
  • The Data Transfer Collection creates access logs for each Department Collection
  • A Department Collection is created for each separate Department / Content Owner
  • Each Department Collection processes its respective access logs
  • All Collections contain Profiles, Datasources, and Filters 


REQUIREMENTS

  • An SPO login with appropriate permissions to use as a Service Account (see the Microsoft API article)
  • A list of planned Department Collections
  • The paths for each SPO site / subsite (these can be used by the Restrict Content feature in each Datasource)
  • The paths where access logs will be saved (see the planning sketch after this list)
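
To keep these inputs organized, the Python sketch below shows one way to capture the planning details for each Department Collection before any configuration begins. The department names, hostnames, and paths are entirely hypothetical placeholders, not values Angelfish reads directly.

# Hypothetical planning data: one entry per Department Collection.
# Replace the hostnames, site paths, and log paths with your own values.
DEPARTMENTS = {
    "HR":      {"spo_site": "contoso.sharepoint.com/sites/hr",
                "log_path": "/angelfish/logs/hr"},
    "Finance": {"spo_site": "contoso.sharepoint.com/sites/finance",
                "log_path": "/angelfish/logs/finance"},
    "Legal":   {"spo_site": "contoso.sharepoint.com/sites/legal",
                "log_path": "/angelfish/logs/legal"},
}

# Each entry becomes a Datasource + Profile in the Data Transfer Collection
# and a matching Department Collection that reads the same log_path.
for dept, info in DEPARTMENTS.items():
    print(f"{dept}: restrict to {info['spo_site']}, logs in {info['log_path']}")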



TASK 1: Create Data Transfer Collection


OVERVIEW

  • Each Profile in this Collection creates access logs
  • SPO logs are downloaded when Profiles are processed
  • Create a Profile & Datasource for each Department Collection
  • Use the "Exclude All" Filter to avoid storing unnecessary data


STEPS


1) Create the Data Transfer Collection
  • Choose a short & descriptive name for the Collection


2) Create a Service Account
  • Link it to the Collection created in step 1
  • This will be used by Datasources in this Collection
  • Enter the relevant Microsoft API details


3) Create a Datasource
  • Use the Service Account from step 2
  • If you plan to have multiple Department Collections:
    • Use a different path in the Location field for each Datasource
    • Enable "Restrict Content" and enter the correct hostname & path for the SPO site / subsite

Datasource Settings

Collection: <Data Transfer Collection Name>
Datasource Type: SharePoint Online
Authentication Type: Service Account
Location: <enter path where logs should be saved>
Restrict Content: <enable if you need to limit the content saved by this Datasource>
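
As a hypothetical illustration of the Restrict Content values, the Python sketch below splits a full SPO site URL into its hostname and path components. The URL is a placeholder, and the exact format Angelfish expects in these fields may differ, so treat this only as a way to see which piece goes where.

from urllib.parse import urlparse

# Hypothetical SPO subsite URL -- replace with your own site / subsite.
site_url = "https://contoso.sharepoint.com/sites/hr"

parsed = urlparse(site_url)
print("Restrict Content hostname:", parsed.netloc)   # contoso.sharepoint.com
print("Restrict Content path:    ", parsed.path)     # /sites/hr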


4) Create a Profile
  • Link the Datasource created in the previous step
  • Click Process Data at the last screen in the New Profile Wizard
  • When processing completes, view the reports
  • Verify the data in the reports is for the correct site/subsite

Profile Settings

Collection: <Data Transfer Collection Name>
Tracking Method: SPO
Hostname(s): <the hostname of your SPO site>


5) Create an "Exclude All" Filter
  • Add this Filter to the Profile created in step 4 once you verify the report data is correct
  • This Filter ensures that Profiles in the Data Transfer Collection only download data and create logs, without storing report data

Filter Settings

Collection: <Data Transfer Collection Name>
Filter Level: Raw
Filter Type: Exclude
Field Name: Client IP (log)
Pattern Match: .*
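
Assuming the Pattern Match field is evaluated as a regular expression (which the .* value suggests), the Python sketch below shows why this Filter excludes every hit: .* matches any Client IP, so no hits reach the Data Transfer Profile's reports, while the downloaded logs remain available for the Department Collections.

import re

# ".*" matches any string, including an empty one, so an Exclude filter
# keyed on Client IP with this pattern removes every hit from the reports.
pattern = re.compile(r".*")

for ip in ["203.0.113.10", "198.51.100.7", ""]:
    print(repr(ip), "excluded:", bool(pattern.match(ip)))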


6) Schedule the Profile to Run
  • In the Scheduling tab, select the desired processing frequency
  • Click the Save button


Repeat steps 3-6 for each planned Department Collection
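
Because each Datasource must write to its own directory, a quick sanity check like the hypothetical Python sketch below can confirm that no two of the planned Location paths collide. The paths are placeholders only.

from collections import Counter

# Hypothetical Location values for the per-department Datasources; every
# Datasource in the Data Transfer Collection needs its own directory so the
# Department Collections never read each other's logs.
locations = [
    "/angelfish/logs/hr",
    "/angelfish/logs/finance",
    "/angelfish/logs/legal",
]

duplicates = [path for path, count in Counter(locations).items() if count > 1]
print("Duplicate Location paths:", duplicates or "none")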


TASK 2: Create Department Collection(s)


Repeat the steps in this task for each separate Department or Content Owner that requires a Collection.


1) Create a Department Collection
  • Choose a short & descriptive name for the Collection


2) Create a Datasource
  • The Data Transfer Collection creates logs for this Datasource to use
  • Enter the location of these logs in the Location field
  • If logs aren't saved locally, select the appropriate Datasource Type

Datasource Settings

Collection: <Department Collection Name>
Datasource Type: Local
Location: <enter path to logs for this Department Collection>
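
Before processing a Department Collection Profile, it can help to confirm that the Data Transfer Collection has actually written logs to the expected directory. The Python sketch below is a hypothetical check, assuming logs are saved as files directly under the Location path; the directory shown is a placeholder.

from pathlib import Path

# Hypothetical Location for this Department Collection's Datasource; it must
# match the Location used by the corresponding Data Transfer Datasource.
log_dir = Path("/angelfish/logs/hr")

if not log_dir.is_dir():
    print(f"{log_dir} does not exist -- check the Data Transfer Datasource Location.")
elif not any(p.is_file() for p in log_dir.iterdir()):
    print(f"{log_dir} has no log files yet -- process the matching Data Transfer Profile first.")
else:
    print(f"{log_dir} contains logs and is ready to process.")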


3) Create a Profile
  • Link the Datasource created in the previous step
  • Click Process Data at the last screen in the New Profile Wizard
  • When processing completes, view the reports
  • Verify the data in the reports is for the correct site/subsite
  • You can create multiple Profiles that use the same Datasource
  • Set & save a processing schedule for this Profile

Profile Settings

Collection: <Department Collection Name>
Tracking Method: SPO
Hostname(s): <the hostname of your SPO site>
Datasource: <Datasource from step 2>
Filters: <as needed>


4) Assign Collection Manager(s) to manage the Department Collection
  • This is accomplished in the config settings for each Collection, in the Collection Managers tab
  • Users have a global scope, which means one User can be a Collection Manager for multiple Collections
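
Because User accounts are global, the same person can manage several Department Collections. The Python sketch below simply illustrates that mapping; the usernames and Collection names are hypothetical placeholders, and the actual assignment is made in the Collection Managers tab as described above.

# Hypothetical Collection Manager assignments: one User can appear under
# multiple Collections because Users are global in Angelfish.
collection_managers = {
    "HR Department Collection":      ["jsmith"],
    "Finance Department Collection": ["jsmith", "lchen"],
    "Legal Department Collection":   ["lchen"],
}

for collection, managers in collection_managers.items():
    print(collection, "->", ", ".join(managers))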