Events from .NET

The Windows application using the tracker should target .NET Framework 4.6 or above.

Step 1: Installing the tracker

Download Snowplow.Datacoral.Tracker.dll from the S3 location below and add it as a reference to your solution.

aws s3 cp s3://datacoral-install-us-west-2/DotNetTracker/Snowplow.Datacoral.Tracker.dll .
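
If you manage references in the project file rather than through the IDE, the DLL can also be referenced directly from the .csproj. A sketch, assuming the DLL was copied into a libs\ folder inside the project (the folder name is illustrative; adjust the HintPath to wherever you placed the file):

```xml
<ItemGroup>
  <!-- HintPath is an assumption: point it at wherever you copied the DLL -->
  <Reference Include="Snowplow.Datacoral.Tracker">
    <HintPath>libs\Snowplow.Datacoral.Tracker.dll</HintPath>
  </Reference>
</ItemGroup>
```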

Step 2: Using the tracker

Importing the library

Add the following using lines to the top of your .cs scripts to access the Tracker:

using Snowplow.Tracker.Emitters;
using Snowplow.Tracker.Endpoints;
using Snowplow.Tracker.Logging;
using Snowplow.Tracker.Models;
using Snowplow.Tracker.Models.Events;
using Snowplow.Tracker.Models.Adapters;
using Snowplow.Tracker.Queues;
using Snowplow.Tracker.Storage;

Creating a tracker

To use the Tracker in your code, instantiate its components and start the Tracker singleton with the following snippet. Datacoral will provide the collector endpoint URL and the list of supported environments along with their corresponding API keys.

```dotnet
// Create logger
var logger = new ConsoleLogger();

// Controls the sending of events
var endpoint = new SnowplowHttpCollectorEndpoint(
    "Collector endpoint URL",
    method: HttpMethod.POST,
    port: null,
    protocol: HttpProtocol.HTTPS,
    xDataCoralEvmt: "ENVIRONMENT",
    xApiKey: "API_KEY",
    l: logger);

// Controls the storage of events
// NOTE: You must dispose of storage yourself when closing your application!
var storage = new DefaultStorage();

// Controls queueing events
var queue = new PersistentBlockingQueue(storage, new PayloadToJsonString());

// Controls pulling events off the queue and pushing them to the sender
var emitter = new AsyncEmitter(endpoint, queue, l: logger);

// Contains information about who you are tracking
var subject = new Subject().SetPlatform(Platform.Mob).SetLang("EN");

Tracker.Instance.Start(emitter: emitter,
    subject: subject,
    trackerNamespace: "NAMESPACE",
    appId: "APP_ID",
    l: logger);
```

This starts a global singleton Tracker which can be accessed anywhere via the Tracker.Tracker.Instance.{{ method }} chain. Once this has run, everything is in place for asynchronous event tracking.
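
When your application shuts down, remember the note in the setup snippet: the tracker will not dispose of the storage for you. A minimal shutdown sketch, assuming the storage object created above is still in scope and implements IDisposable (as in the standard Snowplow .NET tracker):

```dotnet
// Stop the tracker; this halts the background emitter
Tracker.Tracker.Instance.Stop();

// Dispose of the event storage yourself, as noted in the setup snippet
storage.Dispose();
```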

Tracking specific events

The *Track* function is used to track all events. You can pass any of the *IEvent* types to this function:

```dotnet
Tracker.Tracker.Instance.Track(IEvent newEvent);
```

Example of tracking a structured event:

```dotnet
Tracker.Tracker.Instance.Track(new Structured()
    .SetCategory("shop")
    .SetAction("add-to-basket")
    .SetLabel("Add To Basket")
    .Build());
```

Tracking unstructured events

Use Track(SelfDescribing) to track a custom event, which consists of a name and an unstructured set of properties.

Example event JSON to track:

```
data: {
    name: 'viewed_product',
    productId: 'ASO01043',
    category: 'Dresses',
    brand: 'ACME',
    returning: true,
    price: 49.95,
    sizes: ['xs', 's', 'l', 'xl', 'xxl'],
    availableSince: new Date(2013,3,7)
}
```

```dotnet
// Create a Dictionary of your event data
Dictionary<string, object> eventDict = new Dictionary<string, object>();
eventDict.Add("event_name", "viewed_product");
eventDict.Add("productId", "ASO01043");
eventDict.Add("category", "Dresses");
eventDict.Add("brand", "ACME");

// Create your event data
SelfDescribingJson eventData = new SelfDescribingJson("iglu:com.acme/save_game/jsonschema/1-0-0", eventDict);

// Track your event with your custom event data
Tracker.Tracker.Instance.Track(new SelfDescribing()
    .SetEventData(eventData)
    .Build());
```

The event name will be reflected in the event_name column in Redshift, and the data field above will be reflected in the ue_data JSON column.

Custom Contexts

Custom contexts can be used to augment any standard event type, including unstructured events, with additional data. Each custom context follows the same structure as an unstructured event. Even if only one custom context is attached to an event, it still needs to be wrapped in an array.


Example context JSON:

```
data: {
    context_name: 'experiment',
    key1: 'value1',
    key2: 'value2'
}
```

```dotnet
// Build the custom context
GenericContext context = new GenericContext()
    .Add("context_name", "experiment")
    .Add("key1", "value1")
    .Add("key2", "value2")
    .Build();

// Contexts are always passed as a list, even when there is only one
List<IContext> contextList = new List<IContext>();
contextList.Add(context);

// Attach the context list to any event, e.g. a page view
// (the URL below is illustrative)
Tracker.Tracker.Instance.Track(new PageView()
    .SetPageUrl("http://example.com/products")
    .SetCustomContext(contextList)
    .Build());
```

Contexts will be loaded into separate columns in the Redshift table as JSON strings. You should specify a name ('experiment' in the above example) and make sure a column with the ctx_ prefix ('ctx_experiment') exists in the Redshift table.

For more information refer to