I need to get the entire day's sales as a report via the API as JSON, broken down into items sold - very standard stuff. It needs to go to a hospitality provider.

Which endpoint do I get it from / what is the call?

Thanks

I'll paste a screenshot below of an item that sold at a shop other than the one where Lightspeed had the on-hand listed, because it was never transferred in the system. Our Short North shop shows -1 QoH, while our MXC shop shows 1 QoH. Shouldn't reporting net these out to show a sum total of 0 QoH? I would assume that either the Lightspeed Analytics reports or the canned Lightspeed Retail reports would just foot out the totals across all shops combined, correct? The Lightspeed Analytics report does break out the shops with the multi-store array option, but even then it doesn't appear correctly.

With the option to allow negative inventory sales, did all of the affected reports receive updates to handle things? I know we can run the Lightspeed Retail canned Negative Inventory report, and then transfer stock to properly zero things out. But until we do it appears as if reporting will be incorrect.

Complete newbie at Lightspeed here. I would like to query an item. I am using https://github.com/sayonetech/lightspeed-api-python-client, but my question is not about Python. It is about this error:

content=b'{"httpCode":"401","httpMessage":"Unauthorized","message":"Invalid access token.","errorClass":"BadAuthenticationRequestException"}'

How do I go about asking the retailer for an authorization token?

Also, if you are interested in assisting me on an hourly fee basis please contact me at gary.kuipers@casinfosystems.com

Thank you

Specifically, I want to access Transfer Reports that auto-populate (update) data from the site.

I'm looking to simplify my company's transfer reporting for data tracking.

At the moment, I format the data into a table, and filter for the appropriate transfer number.

However, my coworkers are not quite as savvy. I would like to pull the data into one set workbook where they can use slicers and don't have to constantly export the data as an .xlsx file.

Often, when running a retail store, we'll have some products prepared as Matrices, and other products, not.

When we're running Sales reports for Matrix products, we may not be interested in which size or colour sold, simply the *style* of product. For this kind of report, we could run a Sales report with a Dimension of "Matrix"

This is a good start, but we soon find a limitation: All the products that are not matrices are gathered into one null Matrix line.

However, if we try to find out what products *these* are, we need to add the Dimension of "Description"...

So this clarifies the *non-Matrix* products, but also separates the Matrix products.

Wouldn't it be nice if there were a way to group Products together when they're part of a Matrix, and leave them separate when they're not?

(Look at that! What we're looking for almost completely lines up as an IF statement!)

**Step One: Is the Product a Matrix?**

We could create an IF statement to say: if a Product is a Matrix, return its Matrix value, if not, return its Description.

How can we tell which Products are Matrix Products? Let's look at the data for a minute...

Products that **are not** Matrices have a null "Matrix" description.

Products that **are** Matrices have a "Matrix" description.

So let’s start by going to our Custom Fields, and creating a “new” Custom Dimension…

This should open up the Custom Dimension formula editor...

And to start, let’s just add the “is_null” function…

And then let’s look for the Matrix.

When we save the function and run the report again, we’ll get the new column which, if there is a “Matrix” description, will show “No”, and if there is no Matrix description, will show “Yes”.

Let’s save our report as new…

So now, let’s go back to our “Null Matrix” Custom Dimension, click on the gear icon, and duplicate…

And now, let’s open the "Copy" and add our IF statement around the calculation: if is_null is “No”...

then return the Matrix…

If is_null is not “No”, then return the Product Description…

So now, when we save and run, we should see the Matrix description where one is available, and we should see a Product description where no Matrix is available…
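This null-check logic is easy to sanity-check outside Analytics. Here is a rough sketch in Python (not Looker syntax; the sample descriptions are made up) of how the rule behaves:

```python
def matrix_or_description(matrix_description, item_description):
    """If the product has a Matrix description, return it; otherwise fall back to the item's own description."""
    if matrix_description is not None:  # equivalent to is_null(...) being "No"
        return matrix_description
    return item_description

# A Matrix product gathers under its style name
print(matrix_or_description("Classic Tee", "Classic Tee - L - Blue"))  # Classic Tee
# A non-Matrix product keeps its own description
print(matrix_or_description(None, "Gift Card"))  # Gift Card
```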

Again, let’s save as new…

**Step Two: A bit of Cleanup**

Now that we have a Matrix or Description field, we can hover over the starter dimensions of "Description", select the gear icon, and "Remove"

Let's do the same for the "Matrix" and for our first "Null Matrix" Dimensions.

When we run the report again, it will gather the respective descriptions together...

If we want some additional clarity, we could make a duplicate of our "Null Matrix Copy"...

and replace the rules with labels:

This way, when we save and run our report

We'll also get identifiers about whether we are looking at a matrix grouping or a single product!

Spoiler Alert! These are the Custom Dimension formulae that we used today:

**Matrix or Item Name**

`if((is_null(${item_matrices.description}))=no,${item_matrices.description},${items.description})`

**Matrix or Item Label**

`if((is_null(${item_matrices.description}))=no,"Matrix","Description")`


Now perhaps we want to prepare a report that will show us Products Sold by Size.

What we could do is just start a new Sales Report, and as our Dimension, just use **Item> Matrix> Attribute 2**

(this is where the default "Size" convention lives in Analytics, alongside that of any custom matrices that you may have created in Retail)

Then, let's just add one measure: Sales Line Quantity Sold

So this will show us, of all our Matrix products, the totals sold by size. But, most stores (including ours, we're only human) will experience a few problems here.

**First,** most stores have not historically used any naming convention for their Matrix attributes, so sizes, over the course of time, may have been created as "L", "Large", "LG", etc.

This means that similar products are not gathered together. In the example above, 547 "L" Products were sold, but so were 33 "Large" Products

**Second,** neither Lightspeed nor Analytics will sort sizes according to size. The closest is sorting alphabetically, in which case we will get:

Large, Medium, Small, X-Large, X-Small

Both data challenges make reading the results, well, challenging.

So, can we make any easy changes that will help us make more meaningful sense out of this?

Yes we can.

**Step One: Redefining one value of Attribute 2**

In an earlier article, we looked at using Multiple If Statements to return results from logical tests. We could use Custom Dimensions to do the same thing with the Sizes that exist in our data.

To begin a new Custom Dimension, navigate to Custom Fields> New> Custom Dimension

This will open up the editor for our Custom Dimension calculation...

and let's start with a one-rule IF statement. Let's say, from my results above, that I want to qualify "Large" as "L"

So I'll start my IF statement:

and remember, a simple IF function is constructed as follows:

**IF** ( Logical Test, *Results if True*, *Results if False* )

So I'm going to say, IF Attribute 2 is "Large", "L"

and just for now, let's say if Attribute 2 is not "Large", just return null...
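As a quick illustration of the one-rule version (a Python sketch, not Looker syntax):

```python
def redefine_size(attribute_2):
    """One-rule IF: IF(Attribute 2 = "Large", "L", null)."""
    if attribute_2 == "Large":
        return "L"
    return None  # results if false: just null, for now

print(redefine_size("Large"))   # L
print(redefine_size("Medium"))  # None
```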

So when I save my calculation and run the report again...

I'll see here that "Large" returns "L"!

So now, let's change the results if false. I'm going to go back to my Custom Dimension, click on the gear icon, and select "Edit"

and now, instead of "null" being my results if not true, I will change the rule to being "Attribute 2"

So now my calculation will look like this:

and when I save the calculation and run the report again, I'll see something different:

The only rule I checked for was whether Attribute 2 is "Large"; all the other sizes that do not fit this criterion kept their original definition.

**Step Two: Redefining multiple values of Attribute 2**

Now that I know I can change the Size convention, I want to be able to do that for multiple sizes. So, we need to add some more IF statements to our original Custom Dimension.

Just to be safe, I'm going to go to my "Redefined Size" Custom Dimension, click on the gear icon, and select "Duplicate"

This way, I'll get a copy of the Custom Dimension that I can work with (and maybe break), but fairly consequence free.

So for multiple IF statements, the logic follows this process

**IF** ( Logical Test 1, *Results if True*, **IF** ( Logical Test 2, *Results if True*, *Results if Both Tests are False* ))

Clicking on the gear icon of the Copy, let's select "Edit"

Then we'll add some more rules...

and closing off our calculation with results if not true (Attribute 2), then closing brackets for every IF statement
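The nested pattern is the same in any language. Here is a Python sketch of the idea (the specific size spellings are hypothetical examples, not taken from real data):

```python
def redefine_size(attribute_2):
    """Nested IFs: each test remaps one spelling; the final fallback keeps the original value."""
    if attribute_2 == "Large":
        return "L"
    if attribute_2 == "LG":       # hypothetical alternate spelling
        return "L"
    if attribute_2 == "Medium":
        return "M"
    if attribute_2 == "Sm":       # hypothetical alternate spelling
        return "S"
    return attribute_2  # results if all tests are false

print(redefine_size("LG"))       # L
print(redefine_size("X-Small"))  # X-Small (unchanged)
```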

So let's save this, and run this...

So our "Copy" function looks much cleaner than the first two. But there's an even better way to test.

Let's remove the first Dimensions,

and only leave the last copy of our Custom Dimension, and run.

Here are our results, the exact same *data*...

But now it's much more organized.

Let's save our work here as new

**Step Three: Define a Sorting Convention**

So now our data is cleaner and much easier to read. The only thing missing now is a meaningful sorting convention.

This too, can be done using a Multiple IF statement.

Let's start by creating a new Custom Dimension:

And now, we'll start an IF statement, looking at our Redefined Sizes Copy...

Beginning with the very smallest option...

and I'm going to define the sizes in increments of 10 *(this way, if more sizes are added in time, I can position them between the existing values)*

and then going up the Size scale...

So when we save and run...

Each size is now assigned to a numeric value, so when we sort by "Sorting"
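The increments-of-10 idea can be sketched in Python to show why the numeric keys sort correctly (the mapping values here are just illustrative):

```python
# Hypothetical sorting convention in increments of 10,
# leaving room to slot new sizes between existing values later
SORT_ORDER = {"X-Small": 10, "Small": 20, "Medium": 30, "Large": 40, "X-Large": 50}

sizes = ["Large", "Small", "X-Large", "Medium", "X-Small"]
print(sorted(sizes, key=lambda s: SORT_ORDER.get(s, 999)))
# ['X-Small', 'Small', 'Medium', 'Large', 'X-Large']
```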

We're getting closer to the report that we've been looking for!

Let's save our results as new

**Step Four: A bit of cleanup**

The only reason we want the Sorting dimension is to sort the meaningful dimensions. We don't need it to be visible on our report.

So let's hover over its gear icon, and select "Hide from Visualization"

Now, if we plot this on a table or graph

It becomes the graph we wanted all along!

*Clever observers may have noticed that I pushed "Men's" and "Women's" sizes both into the simple version of the Size conventions. Keep in mind, though, that nothing is stopping us from doing the same with the model type, pivoting perhaps on "Build"*

To create something like...


However, we can use Custom Dimensions to identify which days' sales in years past should be considered in relation to this month.

This process is a bit different than the one that we looked at in Relative Date of the Year, but the end goal is the same: identifying the sales from last year that align with this month's sales-to-date.

**Thinking about which sales to include**

Let's say that today is May the 28th, 2019. So far, for this month, Analytics will show us Sales from May 1st through May 27th. To compare our trajectory, we may want to look at Sales for May 1st through May 27th, 2018 (or further back, etc.)

This is actually not a complicated report to prepare. There are two rules that we want to apply to all Sales so that their results can be measured against this Month-to-Date.

**Rule 1:** We want the month of the Sale to be *the same* as the month now.

**Rule 2:** We want the day of the Sale to be *less than* the day now.
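The two rules combine into a single boolean test. Here is a small Python sketch of the same logic, using the standard datetime module:

```python
from datetime import date

def in_month_to_date(sale_date, today):
    """Rule 1: same calendar month as today. Rule 2: day strictly earlier than today's day."""
    return sale_date.month == today.month and sale_date.day < today.day

today = date(2019, 5, 28)
print(in_month_to_date(date(2018, 5, 14), today))  # True: May, and day 14 < 28
print(in_month_to_date(date(2018, 5, 30), today))  # False: day not earlier
print(in_month_to_date(date(2018, 6, 2), today))   # False: different month
```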

Let's open up a Recent Sales report and remove the filters for this week...

**Step One: What is the Month of the Sale?**

So let's go to our Custom Fields, and add a New Custom Dimension...

This will bring up the field to create our new Custom Dimension

Using our Date-Functions, we'll start with the extract_months function....

...when we add it to the editor, it will look like this:

...then, between the brackets, we'll look for the field "Sale Complete Date"

When I add it to the editor it should look like this* (title added to illustrate)*:

So when we save and run the report again, our data should look like this...

So our sales in January should show up as a 1, sales in February as a 2, and so on...

**Step Two: What is the Month Now?**

Similarly, let's add a new Custom Dimension...

and start extracting the Month...

But now, we'll extract the month from *now*.

So now, when I save and run my report again, I will see a second column that only shows me the number 5 (as "now", currently being May, corresponds to the number 5)

Let's save our report work as new...

**Step Three: What is the day of the Sale, and the day Now?**

Similarly to the month calculations in Steps One and Two, we also want to extract the day from the Sale, as well as the day from now.

Let's add a new Custom Dimension to extract the day from the Sale...

...and a new Custom Dimension to extract the day from Now...

So when we save and run the report, we'll get two new columns, highlighting all our source numbers.

...and once again, let's save our report as new.

**Step Four: Where are the Months the same?**

So now, let's apply our first rule, the Month of the Sale is the same as the month it is now

If we go to our Custom Dimensions, let's start by hovering over the "Month of Sale" and selecting "Duplicate"

This will make a copy of the calculation that will make the next step easier.

On the "Month of Sale Copy", let's hover to get the gear icon, and select "Edit"

This will open up the editor with a copy of our original calculation:

So now, let's add =

...then add our calculation to extract the month from now.

So my calculation now looks like this:

`extract_months(${sales.time_stamp_date}) = extract_months(now())`

And when we save and run the report, we'll get a new column...

Where the month is the same, we'll get a "Yes", where it is not the same, we'll get a "No"

And let's save our work so far as new...

**Step Five: Where are the Sales days less than the day now?**

Similarly, let's go back to our Custom Dimensions, select the gear icon on "Day of Sale", and select "Duplicate"

...and in our copy, we'll add the second rule, is less than the day now...

so my calculation now looks like

`extract_days(${sales.time_stamp_date}) < extract_days(now())`

and when I save and run the report

We'll get the new column showing us "Yes" if the day is earlier, "No" if the day is not earlier.

Once again, let's save.

**Step Six, where are both rules true?**

So now, I have two simple rules that, when combined, will highlight all relative Months-to-date in years past.

Let's Duplicate the Custom Dimension for "Same Month as Now"

On the new Copy, let's select Edit

and then add the function: AND

then add a copy of the same calculation from our "Earlier Day" Custom Dimension

So now my calculation looks like...

`extract_months(${sales.time_stamp_date}) = extract_months(now()) AND extract_days(${sales.time_stamp_date}) < extract_days(now())`

...and when I save and run the report, I will get this final column:

If *either* of the first two rules is "No", then the final column will be "No"; only if *both* rules are "Yes" will the final column be "Yes".

So let's save.

**Step Seven: Applying the Custom Dimension**

So now, we have illustrated that the Final Column is identifying the sales we wish to compare. However, we may not need any of these columns to be visible on our report.

So, let's remove the starter columns here,

*(In truth, we could delete the starter Custom Dimensions entirely, as all of the starter functions are now implicit in our final Custom Dimension. Use caution, however: once a Custom Dimension is deleted, there is no way to recover it.)*

So, now that we have dropped all the "starter" details, let's pivot on, say, the Sale Year.

So now we can easily see, of all the sales from years past, which ones are comparable to this month to date?

We could also filter on our new Custom Dimension

Then perhaps only look at Sales by Brand in this time frame, year over year

and so on!

*Spoiler Alert, these are the Custom Dimension formulae we used today*

**Sale Month is the Same Month as now**

`extract_months(${sales.time_stamp_date}) = extract_months(now())`

**Date is Earlier in Month than now**

`extract_days(${sales.time_stamp_date}) < extract_days(now())`

**Both Rules are true**

`extract_months(${sales.time_stamp_date}) = extract_months(now()) AND extract_days(${sales.time_stamp_date}) < extract_days(now())`


However, we can use Custom Dimensions to identify which days' sales in years past should be considered in relation to this year.

A good way to start is to identify the relative-date of the sale.

**Thinking about which days to include**

Let's say that today is May 27th, 2019. I want to compare my sales for January 1st through May 27th 2019 against the sales for January 1st-May 27th 2018 (and so-on in reverse).

To start, we want to plot all sales to a position against *this* year. This means that if a sale took place on May the 4th, 2016, we would align it with May the 4th, 2019.

Secondly, we want to plot where *this year's relative sales* are in relation to *now*. This means that if a sale took place on June 10th, 2018, we don't want to include it, yet.

**Step One: The Relative Date**

Using our date functions, we can use "Date" to create a *relative date* this year for all sales. We can use the "Extract Year" function to identify which year *this* year is, and we can use the "Now" function to identify the position in the year that we currently occupy.

So let's start by launching the "Recent Sales" report, and let's remove the filter for Sales Date is in the past 1 Weeks

Then let's open the Custom Fields...

and then select **New> Custom Dimension**

Remember that the "Date" function requires three arguments: the Year, the Month, and the Day

To plot all sales against this year, let's extract_year...

...from now...

To plot the month, let's extract_months

From the completed sales date...

...and let's do the same for the day...

So, our final "relative date" function should look like this:

`date((extract_years(now())),(extract_months(${sales.time_stamp_date})),(extract_days(${sales.time_stamp_date})))`
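For readers more comfortable with general-purpose code, here is the same idea sketched in Python: keep the sale's month and day, but swap in the current year.

```python
from datetime import date

def relative_date(sale_date, now):
    """date(extract_years(now), extract_months(sale), extract_days(sale)):
    re-plot the sale onto this year."""
    # Note: a Feb 29 sale raises ValueError when `now` falls in a non-leap year;
    # a real implementation would need to decide how to handle that edge case.
    return date(now.year, sale_date.month, sale_date.day)

print(relative_date(date(2016, 5, 4), date(2019, 5, 27)))  # 2019-05-04
```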

When we save...

The "Relative Date" should look at every sale date, whether it's a sale from this year or years past, and then make the year, this year.

Following our best practices, let's save our work...

As new...

**Step Two: Date Now**

Of course, it is not necessarily the date of the sale itself that we want to measure. We want to compare the sales' relative dates to now.

So, let's add a new Custom Dimension...

...which will be a date function...

But now, we will extract the year, the month, and the day from *now*.

So our function will look like:

`date((extract_years(now())),(extract_months(now())),(extract_days(now())))`

So our report now looks like:

And saving As New

**Step Three: Relation between Relative Date and Now Date**

Now we want to look at the difference between the relative dates of sales and now.

To do this, let's add a new Custom Dimension...

...and we'll use the diff_days function to count the difference.

Diff_Days requires two arguments: the first date, and the second date

So, for our Start Date, let's copy the function for our Relative Date...

and paste...

and for our end date, let's copy the function for our Now Date, copying...

...and pasting...

So my final function for differences of days is...

`diff_days((date((extract_years(now())),(extract_months(${sales.time_stamp_date})),(extract_days(${sales.time_stamp_date})))),(date((extract_years(now())),(extract_months(now())),(extract_days(now())))))`

...and when we run the report, we will get a number counting the difference

Notice anything about this number?

If the relative day is *after* now (or, later in the year), the difference is **negative**.

If the relative day is *before* now (or, earlier in the year), the difference is **positive**.
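The sign convention is easy to verify with a Python sketch of diff_days (a simple date subtraction):

```python
from datetime import date

def diff_days(start, end):
    """Days from start to end: positive when start is before end, negative when after."""
    return (end - start).days

now = date(2019, 5, 27)
print(diff_days(date(2019, 6, 10), now))  # -14: relative day is after now
print(diff_days(date(2019, 5, 4), now))   # 23: relative day is before now
```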

Let's save our work as new.

**Step Four: Include or not?**

So, can you guess what our final function will be, to decide whether we should include a sale in our comparison or not?

That's right!

Is the difference of days greater than zero?

To start this, let's click on the gear icon next to our "Difference in Days" function and select "Duplicate"

This will create a "Difference in Days Copy" function that we can easily add our final test onto...

To do this, click on the gear icon on the "Copy" and select "Edit"

This will open up the copy of our function...

...on which we will add one final argument: "is greater than zero"

So my function now looks like:

`diff_days((date((extract_years(now())),(extract_months(${sales.time_stamp_date})),(extract_days(${sales.time_stamp_date})))),(date((extract_years(now())),(extract_months(now())),(extract_days(now())))))>0`

And when I save, it will look like:

So now, we have a new simple Y/N Dimension, was this sale earlier in the year in relation to today? THIS is the Dimension we'll be using for our analysis now.

Let's save our work.

**Step Five: Pivot or Filter using this Dimension**

Now that we have created the Custom Dimension "Is Earlier", we can remove most of the other dimensions from our data. After all, we may not be interested in reporting on the date itself. So for now, let's remove the "Completed Date"...

as well as our starter Custom Dimensions...

But when we get to "Is Earlier", let's, for now, Pivot on the results...

...and to illustrate, let's add a Dimension of the Sale Completed Year

So when we run, we can see a simple model

Showing us for each year, our relative Year-to-Date total!

**A Few more Options**

For additional analysis, we may wish to Filter on the custom dimension of "Is Earlier?"

and let's filter for "Is Earlier" is "Yes"

Giving us now one number to compare this year's totals to the trajectories of years past

Note: This is a really good time to use the Visualization of ...> Funnel

As it compares the top result as a percentage comparison to the other results...

We could also pivot on the year of the sale, and perhaps use a Primary dimension of Brand, to see how Brands are performing relatively Year-to-date

and so on...

*Spoiler Alert, these are the Custom Dimension formulae we used today*

**Relative Date**

`date((extract_years(now())),(extract_months(${sales.time_stamp_date})),(extract_days(${sales.time_stamp_date})))`

**Date Now**

`date((extract_years(now())),(extract_months(now())),(extract_days(now())))`

**Difference in Days**

`diff_days((date((extract_years(now())),(extract_months(${sales.time_stamp_date})),(extract_days(${sales.time_stamp_date})))),(date((extract_years(now())),(extract_months(now())),(extract_days(now())))))`

**Is Earlier**

`diff_days((date((extract_years(now())),(extract_months(${sales.time_stamp_date})),(extract_days(${sales.time_stamp_date})))),(date((extract_years(now())),(extract_months(now())),(extract_days(now())))))>0`


Every single file I open, the first thing I have to do is a 'Find, Replace' - why are all of the monetary figures exported with the following "¬£" (copied and pasted straight from Excel, as it is constantly in my 'Find' box)? I'd rather have no denomination and rely on the column heading. Oh - and column headings that start with certain characters like "=" also do not play nicely in Excel.

If there's a setting somewhere I missed please let me know so I can save a little bit of time!

The language of Custom Dimensions is Looker Syntax, which you can learn more about either in the Looker Syntax reference guide, or in our Calculations Home.

There are a few differences between how Custom Dimensions and Table Calculations work though. Here are some of the key differences

**1) Table Calculations require their relative data to be on your table. Custom Dimensions can read information that is not on your data table.**

Let's say I want to gather my top-level categories into Cohorts of multiple Top-Level Categories. I could use the same formula to create the Cohorts, either in a Table Calculation or a Custom Dimension.

The visible results are the same...

But if I remove the original Top-Level Category from the table, what happens?

My Table Calculation won't work any more, but my Custom Dimension does.

**2) We can Pivot and Filter on Custom Dimensions**

Table Calculations can only be defined one column at a time. Our Custom Dimensions can pivot so that the data follows the rules that we set automatically...

Also, we can use our Custom Dimensions as a Filter...

**3) Custom Dimension Data can exceed the displayed row limits, and continue to sort properly.**

When we have more data than the available rows can display, we will see the yellow "Row limit reached" bar...

Now if we try to sort using Table Calculations we will get an error...

...but sorting by Custom Dimensions will continue to work as with any other Dimension!

**4) Custom Dimensions can't look at Measures, Table Calculations, or Position on the table.**

The formula builder will only see Dimension-specific data. If we look for Sales Totals in the "Edit Custom Dimension" tool...

...it won't even know where to look. So some functions may still require you to use Table Calculations.

But we'll find some neat ways to use Custom Dimensions to make more precise reports.

Not only can calculations be used to add content to your reports, now they can be used to create Custom Dimensions on your reports too.

Here is a basic overview of some changes to how Calculations will work going forward.

In the earlier versions of Analytics, Calculations were launched from the "Calculations" button on the "Data" bar...

But, now they will live under the section of "Custom Fields" above the Measures and Dimensions on your reports.

To launch a new Calculation, click on New> Table Calculation:

...and you will get the usual Calculation field...

Each new Calculation will now appear as its own field, instead of being stacked on top of the others. So if you want to add more, simply go back to the "New" button...

to make a copy of an existing calculation...

The rest of the rules of functions and Calculations remain the same.

but,

there are new tools: Custom Dimensions and Custom Measures will make it easier to design, build, and sustain custom reports. Stay tuned for some new tips and tricks that will make it easier to build the reports that you need.

Sometimes, we may want a report to return the date that a product was last sold.

This is not a difficult calculation to prepare from the Inventory Report, because "Days Since Sold" is a measure that is included, for example, on the Dusty Inventory report.

From our collection of Date Functions, we can use the Add_Days function to count backwards from the current day.

Remember that there are two arguments to Add_Days:

- The *first* argument is the number difference.

- The *second* argument is the date that you wish to adjust.

It will look something like this:

*( difference, start date )*

So, for our function, we want it to look like:

( negative days since sold, today)

So to start, let's start a new Table Calculation, and we'll look for the add_days function...

and we'll look for *negative* days since sold...

Now, if we use now() as our source date...

We get a timestamp with our days since last sold (this will not be the actual timestamp of the sale; rather, it is the reflective timestamp relative to the current time).

So, to make this cleaner, for the second part of our function we could convert now into a date using the Date function.

Remember, the Date function requires three arguments: ( year, month, day), so to prepare the function, I'm going to add three sets of brackets, separated by commas

So we want to find the year, the month, and the day from *now*. To do this, we could use the extract_years, extract_months, and extract_days functions. I will drop one of these between each set of brackets...

...and then, what do we want to extract these times from? Well, from *now*. So I'm going to add the now() function into all three empty brackets...

So now, when we save,

We just get a simple date!

*Spoiler alert, the final calculation we used is:*

`add_days(-${cl_item_facts.days_since_sold},date((extract_years(now())),(extract_months(now())),(extract_days(now()))))`
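The arithmetic behind that calculation is just "today minus the number of days since the item sold". A Python sketch, with an illustrative value:

```python
from datetime import date, timedelta

def last_sold_date(days_since_sold, today):
    """add_days(-days_since_sold, today): count backwards from today."""
    return today - timedelta(days=days_since_sold)

# Hypothetical example: an item last sold 45 days before May 27th, 2019
print(last_sold_date(45, date(2019, 5, 27)))  # 2019-04-12
```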


Can someone please point me to where I should go to assist my CFO?


Sometimes we can build in-table summaries of the results that Analytics gives us.

Let's say that we want to summarize results by First Subcategory.

Let's say that we started with the First Subcategory report that we customized earlier, where we start with the complete category, but then remove everything following the subcategory

We could prepare data to gather totals by First Subcategory, but we would need Calculations to do this, as we cannot use a calculated field as a dimension.

These are the logical steps we need to make this happen:

- Sort the results by First Subcategory

- Determine which is the last instance of each First Subcategory

- Count How many rows in which the same First Subcategory can be found

- Gather the results from each collection of First Subcategories

- Gather the First Subcategories and their respective results.

The crux calculation that we are preparing is Offset List, one of the tools from our Table Functions.

Ready to start? Cool

Sort by First Subcategory

This should be easy enough: we can do this by just clicking on the "Item Category" label, and this should sort everything accordingly.

So far so good

Is this First Subcategory the same as the next one?

This will help us identify the last instance of each First Subcategory.

Let's start now using a simple Offset function. I'm going to use Offset to look at the "First Subcategory" field

and to ask it to look one row down...

so when I save, to the right of each First Subcategory, we should see the next First Subcategory...

Then, we want to see whether they are the same. We can do this with one more simple calculation: =

If the values are equal, Analytics will return a Yes, if not, Analytics will return a No.

So we'll start with brackets...

...look for the "First Subcategory" calculation...

...copy the offset function looking at the row below...

...and then paste it into the second set of brackets...

...so that when we save...

We get a Yes/No calculation where the "No" identifies the final instance of every First Subcategory.
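This offset-and-compare step can be sketched in Python: compare each row to the one below it, and flag the mismatches (the subcategory names here are made up):

```python
def is_last_instance(subcategories):
    """Compare each row to the next row (offset 1); "No" marks the last instance of each value."""
    flags = []
    for i, sub in enumerate(subcategories):
        next_sub = subcategories[i + 1] if i + 1 < len(subcategories) else None
        flags.append("Yes" if sub == next_sub else "No")
    return flags

print(is_last_instance(["Boots", "Boots", "Sandals"]))  # ['Yes', 'No', 'No']
```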

Let's save our work so far as new...

How many rows are the same?

So now we know the last instance of each First Subcategory, but we also will need a count of each one.

A way to start this involves the Match function. Match returns the first row on which a value occurs.

What we will do is match the First Subcategory...

...to the first subcategory...

...so when I save, we will get a Calculation repeating the first row on which each First Subcategory occurred...

Can you guess now how we can count the number of rows each First Subcategory occupies?

Well, now we'll start a Calculation where we return the row()...

...and then subtract...

..the matched reference...

...so that our calculation will look like...

...upon saving...

...we will get a 0 for each First instance of a First Subcategory, and a subsequent count on each row of the subcategory.
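Sketching the row-minus-match step in Python shows where the 0s and the running counts come from:

```python
def rows_since_first(subcategories):
    """row() - match(value, column): 0 on the first instance of each value, then 1, 2, ..."""
    first_row = {}
    counts = []
    for i, sub in enumerate(subcategories):
        first_row.setdefault(sub, i)  # match(): the first row where the value occurs
        counts.append(i - first_row[sub])
    return counts

print(rows_since_first(["Boots", "Boots", "Boots", "Sandals"]))  # [0, 1, 2, 0]
```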

Now let's save our changes...

Create a first and last reference point

The Offset List function we're going to use will need three arguments:

- the values (which will be our Sales Line totals)

- our start point for counting (or, how far back we want to start)

- the end point for counting (or, how many rows we wish to include)

Our values are already on the table.

Our start point now will be the negative row count for each last instance of a First Subcategory. So this becomes a pretty simple IF statement.

I'm going to start with my brackets...

...then I'm going to copy the calculation looking at the reference being the same or not...

...and paste, then grab the second function looking at the Count, copying...

...and pasting...

...now when we save, for each "No", we should see the negative total, for each "Yes" we should see zero...

Now, we want to define the end point. So this will need to be the row count plus one. Why, you ask? Because each count starts at zero.

We can copy+paste the same basic function above...

but substitute the negative sign...

...with a positive 1...

...so upon saving...

We get yet another calculation with the positive row count plus one...
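To make the Back/Forward pair concrete, here is the same logic as a Python sketch, starting from an assumed count column like the one built earlier:

```python
counts = [0, 1, 2, 0, 1, 0]   # running count within each First Subcategory

# a row is the last of its group ("No") when the next count resets to zero
last = [i == len(counts) - 1 or counts[i + 1] == 0 for i in range(len(counts))]

back = [-c if l else 0 for c, l in zip(counts, last)]    # negative count on "No" rows
fwd = [c + 1 if l else 0 for c, l in zip(counts, last)]  # count + 1 on "No" rows

print(back)  # [0, 0, -2, 0, -1, 0]
print(fwd)   # [0, 0, 3, 0, 2, 1]
```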

Let's save our work...

Gather the results

In case you were wondering, this is where the fun starts.

Now we get to roll out the offset_list function that we mentioned above.

First, what value are we looking at? I'm going to reference the sales totals...

...starting with our "Back" Calculation...

...ending with our "Forward" Calculation...

...so our final calculation looks like...

...and when we save...

wayyyyy over to the right, we'll see a list of all the Sale Line totals. Each list should only be displayed on the last instance of each First Subcategory, and should reference only its respective totals...

Now, one more step, let's create a sum...

...of that group total...

...so that when we save, where applicable, we get one total per First Subcategory...
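For anyone who wants to verify the mechanics, here is a Python sketch of what offset_list plus sum is doing, with made-up Sale Line totals and Back/Forward values shaped like the ones above:

```python
totals = [10.0, 20.0, 30.0, 5.0, 5.0, 8.0]   # made-up Sale Line totals
back   = [0, 0, -2, 0, -1, 0]                # negative count on each group's last row
fwd    = [0, 0, 3, 0, 2, 1]                  # count + 1 on each group's last row

# offset_list(value, back, fwd) gathers fwd rows starting back rows up;
# summing that list yields one total per group, on its last row only
group_totals = [
    sum(totals[i + b : i + b + f]) if f else None
    for i, (b, f) in enumerate(zip(back, fwd))
]
print(group_totals)  # [None, None, 60.0, None, 10.0, 8.0]
```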

Now let's save our work to this step...

A bit of cleanup

So there are some supportive calculations here that we shouldn't need on this report. We are using First Subcategory, Back, Forward, and Group Total, we're also going to use "Same" at least once or twice more. Everything else we should be good to drop...

So our lighter report to this point should look a bit like...

Look for the Last Results

So now, we want to prime our report for gathering the meaningful data.

To start, I'm going to ask Analytics to return a 1 for every instance where "Same" is No, and return 0 otherwise.

So we'll start our calculation with brackets...

and then copy+paste our calculation checking whether the First Subcategory is the same as the one below...

When I save this, we should see a 0 for each Yes, and a 1 for each No

Now what I want to do, is have these ones rise in sequence. How do we do that? Let's use our running_total function to return that number...

copying...

...pasting...

...and saving...

now the next fun part, we are going to match the row of the report...

...to this rising number...

...pasting...

...so that when we save...

Each row will return where the next First Subcategory can be found

See where we're going with this yet? We're almost there.

Let's save.

Gather Labels and Totals

Now, to complete the preparation of our data, we're going to use the Index function...

To look at the First Subcategory labels that we created...

...then from them, gather the indexed number...

copying, pasting...

...and saving...

creating a singular entry for all our First Subcategories...
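The whole gather step — flagging last instances, running-totalling the flags, then matching and indexing — can be sketched in Python with an assumed subcategory column:

```python
subcats = ["Boots", "Boots", "Boots", "Hats", "Hats", "Shirts"]

# 1 where "Same" is No (the last instance of a subcategory), 0 otherwise
is_last = [1 if i == len(subcats) - 1 or subcats[i + 1] != s else 0
           for i, s in enumerate(subcats)]

# running_total: the 1s rise in sequence
running, acc = [], 0
for flag in is_last:
    acc += flag
    running.append(acc)

# match each report row number against the rising number,
# then index the labels to get one entry per subcategory up top
next_row = [running.index(r + 1) + 1 if (r + 1) in running else None
            for r in range(len(subcats))]
labels = [subcats[n - 1] if n else None for n in next_row]
print(labels)  # ['Boots', 'Hats', 'Shirts', None, None, None]
```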

finally, let's do the same for the grouped totals we created in Step Five...

so that when we save...

We get...

well, we get a messy table of unclear numbers.

We're almost there. I promise.

Hide the Raw Data

If we open the Table Visualization now, we should see all the starter data...

However, only two of these columns are now of any interest to us...

So let's start hiding everything...

...except...

...our labels...

...and the grouped totals...

So now, when we look at the Visualization...

We get a clean list of all the summary totals by First Subcategory!

Gosh, if you made it this far, you're awesome. Treat yourself to something nice.

]]>

I noticed there is a tip referencing encoding of reserved characters. I have tried the following URLs and get an 'Invalid Argument' error for this URL:

`https://api.lightspeedapp.com/API/Account/accountID/Sale.json?load_relations=%5B"SaleLines","SaleLines.item"%5D&shopID=3`

and the same for this URL: `https://api.lightspeedapp.com/API/Account/accountID/Sale.json?load_relations=["SaleLines","SaleLines.item"]&shopID=3`
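For what it's worth, the first URL above percent-encodes the brackets but leaves the quotation marks and comma unencoded. A fully encoded value can be produced like this — a Python sketch, with accountID left as a placeholder as in the URLs above:

```python
from urllib.parse import quote

relations = '["SaleLines","SaleLines.item"]'
encoded = quote(relations, safe="")           # encode brackets, quotes, and commas
print(encoded)  # %5B%22SaleLines%22%2C%22SaleLines.item%22%5D

# accountID is a placeholder, as in the URLs above
url = ("https://api.lightspeedapp.com/API/Account/accountID/Sale.json"
       "?load_relations=" + encoded + "&shopID=3")
```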

At the end of the day I need to grab sales data that includes:

Hours worked by D/W/M

Average Sales D/W/M

Units Per Sale D/W/M

Total Sales D/W/M

Store Total By D/W/MTD

Average Sales D/W/MTD

Units Per Sale D/W/MTD

Sales By Hour MTD/YTD

Sales By Day MTD/YTD

Store Margin D/M/YTD

Top 10 Products MTD/YTD

Sometimes we have received requests for Analytics to report on what payment methods were used on specific Product Categories or specific Products. This type of analysis is not natively possible, because payments are applied to the sale as a whole, not to individual products. Consider an example:

A Customer purchased a suit for $100, and a hat for $50. The customer's total was $150. The customer gave the store $60 in Cash, and then paid the remaining $90 on Credit Card.

If there are multiple categories and multiple payment methods on one sale, there is no way to know exactly which payment covered which products.

However, there are a few easier scenarios:

- If one product was purchased and one payment method was used.
- If products in five categories were purchased and *one* payment method was used.
- If multiple products in *one* category were purchased and multiple payment methods were used.

These scenarios are all easy to account for, as there is either parity between payment methods and categories, or an even distribution of one across the other.

Because not every sale will be one of the easier scenarios, Analytics cannot make this connection on its own.

But we can build a reasonable working model ourselves.

This will involve coming up with a logical rule for distributing values from multiple payments across multiple categories.

Fortunately, there is a simple one that we can start with.

We could take the total funds and determine the percentage that each method contributed to the total.

For example:

In the above model, the payment methods totalled $150.

The Cash made up 60 of the 150 dollars, or 40%,

The Credit Card made up 90 of the 150 dollars, or 60%,

So we can apply those same percentages to each product on the sale.

This way, we can say the payments were distributed as:

$20 Cash for the Hat

$30 Credit Card for the Hat

$40 Cash for the Suit

$60 Credit Card for the Suit

Is it definitely what happened? No. But it works, and is enough for us to start building a simple working model.
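The distribution rule above amounts to a few lines of arithmetic. Here it is as a Python sketch, using the figures from the example:

```python
# Figures from the example above: a $50 hat and a $100 suit,
# paid with $60 Cash and $90 Credit Card.
payments = {"Cash": 60.0, "Credit Card": 90.0}
items = {"Hat": 50.0, "Suit": 100.0}

total = sum(payments.values())                            # 150.0
shares = {m: amt / total for m, amt in payments.items()}  # Cash 0.4, Credit Card 0.6

# distribute each item's price across the methods by those shares
split = {item: {m: round(price * s, 2) for m, s in shares.items()}
         for item, price in items.items()}
print(split)
# {'Hat': {'Cash': 20.0, 'Credit Card': 30.0}, 'Suit': {'Cash': 40.0, 'Credit Card': 60.0}}
```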

In addition to this, we can use Analytics to start preparing our data.

To determine how much each payment contributed to each invoice, let's start by opening up the Sale Payments report

The default dimension is by Sale Date, so let's start by adding "Sale ID" as a dimension...

...and let's also drop the dimension of Sale Date, this will be of no value in our analysis...

...also, the default report is looking at the previous 1 complete month. Just so that our report isn't enormous, let's open the filters and switch this up to being the previous 1 complete week...

Now, we're not actually interested in the payment amounts themselves, but in the share each method contributed to its sale.

So, we could add a calculation to divide the payment amount...

...by the row total of the payment amount...

and saving...

This will give us a calculation that mostly returns the number 1...

but we may find some instances where the payment is broken up across columns...

Let's go back to our calculation, and perhaps define it as being a percentage with four decimals...

...I'm also giving it a title of "Distribution"

And let's save our working Payments Report...

Finally, we don't need to see the value of payments on the report here, so let's hide the amount column...

So we have a visualized report that shows us by Sale ID only the percentage of payment that each method provided...

Let's save again as new...

...and then download our report as an Excel file...

Part Four: Defining Categories per Invoice

So now we can see the payment methods on each Invoice, we can run a similar Sales report to show us Top-Level Categories by Invoice as well. To start, let's open up the Recent Sales report...

And let's add a few dimensions, first, the Item Top Level Category...

...and then the Sale ID...

...and we can also get rid of the "Sale Date" field...

Also, let's make sure our date filter matches: Sales from the Past 1 complete week...

So when we run this report, we can see some sales have multiple Top-Level Categories on them...

So this is ready now, let's Save this with a custom title...

and again, we can download it as an Excel file...

So now, we can open our "Payments" and "Categories" reports in Excel...

Our Payments Report lists each Method per Invoice as a Column...

and our Categories report lists each Category per Invoice as its own line...

First, we want all the data on one sheet, so I'm going to copy the columns from the "Category" report...

...then paste them all on the "Payments" report to the right of all our Payment data...

...Sometimes adding a bit of colour makes things easier...

...So now, we want to determine how much of each Category's total was assigned to a certain payment method...

To do this, we need to match+index each Category to its respective distribution of methods...

Let's start by using the Excel "Match" function to find where the Category's Sale ID can be found in the list of payments.

On my Excel sheet, the "Category" Sale ID is in Column N, the "Payment" Sale ID is in Column B, so my Excel formula looks like this in cell R2:

...and it shows me that the corresponding Sale ID is in Row 3...

...so far so good, let's drag or copy+paste the formula all the way to the bottom of the category data...

and it's looking good.

Next, we want to Index from each of the payment columns the corresponding percentage. We will want a different column for each payment method, so in Cell T2, I will use the following function

This way, when I drag the formula to the right for each payment method, the payment method column will change, but the Index source (Column R), will stay locked in place.

So when we drag the formula to the end of the data, we get a corresponding percentage value for each Category on each Invoice...

We're almost done!

Now, what we want to do is multiply each category's total against its respective payment amounts.

Most of the payment amounts we see are 100% or 0, although we're seeing some split payments on the list.

So we'll create one final calculation, this time in column AE, multiplying the Value in Column P by the percentages, starting with column T

So, my Excel formula in cell AE2 is

This way, when I drag my formula over, it will follow the dollar amount from Column P, and apply the percentage amount from all the payment columns: T-AB
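The MATCH, INDEX, and multiplication steps above can be sketched together in Python; the Sale IDs, shares, and totals below are invented for illustration:

```python
# One row per Sale ID on the Payments sheet (share per method),
# one row per category line on the Categories sheet. All values invented.
payment_rows = [
    {"sale_id": 101, "Cash": 1.0, "Credit Card": 0.0},
    {"sale_id": 102, "Cash": 0.4, "Credit Card": 0.6},
]
category_rows = [
    {"sale_id": 101, "category": "Hats", "total": 25.0},
    {"sale_id": 102, "category": "Hats", "total": 50.0},
    {"sale_id": 102, "category": "Suits", "total": 100.0},
]

ids = [p["sale_id"] for p in payment_rows]
for row in category_rows:
    match = ids.index(row["sale_id"])        # like Excel's MATCH(..., 0)
    for method in ("Cash", "Credit Card"):   # like dragging INDEX to the right
        row[method] = round(payment_rows[match][method] * row["total"], 2)

print(category_rows[2])  # the $100 suit splits into 40.0 Cash, 60.0 Credit Card
```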

Like so...

So now, to finish preparing our data, all we need to do is reference our Product Categories one final time...

That we can find in Column O...

So now, we have a full set of data for Categories with Payment Methods distributed proportionally among them per sale, we can complete our analysis using a pivot table...

a sumif function...

Or any other way you wish!

Discerning minds may notice that the totals from the Payments and Sales reports may not match. This does not necessarily indicate an error.

]]>

As I mentioned in the discussion on best practices, asking for help is one of the best ways to find the best approach for calculations. This process of turning Dusty Inventory into a Pricing Rule is the result of a collaborative effort with my colleague Bryn Harris.

Let's say we're looking at our Dusty Inventory report, but we want to take some actions within Retail to discount the prices or add the Products to a Price Rule.

We could use Calculations to either Define a new Price, or to add a tag to all the Products to quickly identify them in Retail for further actions.

In this discussion, we'll look at creating tags for adding to Products in Retail.

How you define tags in Retail depends on what their desired workflow application is. I may want to search Retail for all Products that are "dusty" or I may want to find products that were dusty at a certain time. Let's start by creating a default "dusty" tag on the Dusty Inventory Report...

We can start with a simple version of the tag with no calculations, just the word "dusty" with a comma in between quotation marks...

this way, when I save my calculation, I get the simple "Dusty," tag on every product...

but let's say we also want some more detail in this tag, perhaps the date when the product is considered dusty...

Tags must be one string of characters with no spaces. Let's say that today is October 22nd, 2018: a good detailed tag might be "dusty20181022" or "dusty2018oct22"

From our cell-based functions, we could use the "Concat" function to create a tag with the word "dusty", and then a calculation to create the date content in the tag.

Again, let's start just by creating the word "dusty" between quotation marks (this time, without the comma)...

So that our report looks like...

and then, from our date-based functions, we could also add the "now" function to start creating the date components of the tag...

when we add "now", it returns the complete time-stamp...

which may be a bit much for our tag. So, also from the date-based functions, we could use the functions, extract_years, extract_month, and extract_days to pull just the content we're looking for...

So we start by entering just the formulae...

copying the calculation for "now"

and then pasting it between the brackets in the respective extract functions...

not forgetting to label everything...

So when we save, our report looks like...

and following our best practices, let's save the work that we have done here so far...

Now the dates sort best if they are in sequence, so it's a good idea to create the date as a number. To do this, let's multiply the extracted year by 10,000, and multiply the extracted month by 100. Starting with the base formulae...

then copy+pasting the earlier functions...

Now when we save, the year, month and date...

can be added as one number. Pulling the three sources...

...copying...

...pasting...

...and fixing embarrassing errors, like extracting the hour and calling it the day...

...finally creating...

...so that when we save...

...we get one number, YYYYMMDD that will work whether the month or day has one digit or two...
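The year × 10,000 + month × 100 + day construction can be sanity-checked in Python; the date below is the example date from the text:

```python
from datetime import date

today = date(2018, 10, 22)   # the example date from the text; date.today() in practice
tag_number = today.year * 10000 + today.month * 100 + today.day
tag = "dusty" + str(tag_number)   # concat, as in the report
print(tag)  # dusty20181022
```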

And let's save our work so far As New...

Now a lot of the starter calculations here are kind of redundant, I'm going to remove "now"...

and all the "extract" functions...

So we just have these three calculations...

So now, we just need one final calculation to create the final tag: dusty+20181022

We can use "Concat" to do this.

Starting with the function and brackets...

...then copying the content, first "dusty"

and pasting...

...then our date number, copying...

...and pasting...

So now when we save...

...we get one tag combining the date and the dusty status.

Once again, let's save as new...

Now the Retail Import tool will let us import-add multiple tags onto products, provided first that the tags are separated by commas, and secondly that the title of the column is "Add Tags"

So we could add in our "Concat" function the first "dusty" tag, e.g...

Let's also swap out the Product Description...

for the System ID

This will make mapping our new tags to our Retail products much easier

Finally, let's get rid of all but the last calculation...

So that all we have remaining is the System ID, and the new tags

and once again let's save as new...

Finally in Analytics, we can download our file...

...as a CSV...

To make the export file something that works in Retail, there are a few adjustments we want to make. To do this, I'm going to open up my CSV in Excel...

First, we want to get rid of the line numbers in column A. So let's select Column A...

...right-click or control-click, and delete...

also, our System IDs have shown up in Scientific notation. To fix this, let's select them all in Column A

...right-click or control-click, and select "format cells"

and we're going to format it as a number with zero decimal places...

So our data now looks like...

And it's ready to import!

start a new Import...

...select our new file...

...and ask Retail only to update existing products...

If our data is prepared properly, we will get a preview showing us how the mapping will take place...

...and we can import our sheet...

...so that the new tags are added to the products in Retail.

We can test this now, if we go to Inventory> Item Search...

We can search for products by tag...

...and see our results...

Now we can do the same in Quick Edit Items, in Price Rules, in Reporting, and in other Retail tools!


]]>

Let's say that we're using Analytics to follow Inventory and define action points. Maybe we used it to build the Dynamic Reorder Points report....

Now that we can measure what Products need to be reordered, and how many need to be reordered, we can have Analytics prepare the data for us to Import as a PO into Lightspeed.

First, we need to add a Product identifier dimension. The best to use is "System ID", as this is a unique ID in Lightspeed that is never duplicated...

Let's run the report again to include the System ID...

Now our report looks like...

So we have the data that we need to import a PO, but we need to make a few changes.

First of all, not every Product on the report actually needs to be reordered.

What we could do is use an IF statement to nullify Products that do not need to be reordered...

For example, if we are not reordering at least one of a certain product, make its System ID null...

Following our best practices, let's start our calculation with brackets...

...then copy the calculation from our Dynamic Reorder Points value (or whatever formula you may have designed to define your own reorder points)...

then paste it into our IF criteria...

saying that if this value is less than one, return a null cell. If not...

Return the System ID (which Analytics reads as a number)...

So when we save...

We get a column on the right hand side that stops returning System IDs once the reorder amount is less than one.

Now, we could use the same calculation to return the amount to reorder if that amount is not less than one...

I'm going to title this calculation "Order Qty" as Lightspeed will automatically recognize this column header when we import this spreadsheet...

And again we'll copy the formula for determining the amount to reorder...

Paste it into the criteria...

and paste it into the results if not less than one...

So that when we save...
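The two IF calculations can be sketched in Python; the System IDs and reorder amounts below are invented:

```python
# Invented rows: System ID plus the calculated reorder amount.
rows = [
    {"system_id": 210000000001, "reorder": 4.0},
    {"system_id": 210000000002, "reorder": 0.0},
    {"system_id": 210000000003, "reorder": 1.5},
]
for r in rows:
    keep = r["reorder"] >= 1                           # the IF criteria
    r["import_id"] = r["system_id"] if keep else None  # null the ID below one
    r["Order Qty"] = r["reorder"] if keep else None    # null the quantity below one

print([r["Order Qty"] for r in rows])  # [4.0, None, 1.5]
```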

So now, let's save our work as a new report...

From here, all our report needs is cosmetic.

Let's start by hiding the dimensions and measures from visualization...

Because we only need them if their result is greater than 1...

let's also remove all except the last two calculations...

Their formulae are implicit in the final two calculations, so we can safely remove them from the table now...

And again, let's save this report as new...

From here, we can download the report...

and we'll select the option of CSV "with visualization options applied"

If we want to see what the downloaded file looks like, we can open it here...

Now, we can go to Retail, and create a new PO...

and when we save...

We'll see the option to Import...

...from here, we can select the file that we downloaded from Analytics, and then pull its products and order quantities onto the order in Retail!

]]>

If I want to look up a product by name, I use this PHP to generate the API URL string:

"Item.json?limit=10&description=~,".rawurlencode("%".$search."%");

(I tried plain 'urlencode()' and double encoding 'urlencode(urlencode())' too)

when $search includes an ampersand &, no products are found.

i.e. search for 'B&W Speaker' does not give me a result, but 'W Speaker' does.

Any ideas on how to encode a string with & in it properly?
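One thing worth checking is whether the ampersand actually reaches the API as %26. Here is a Python sketch of the single-encoded value — rawurlencode() should produce the same string, so if the API still finds nothing, the server may be decoding the parameter a different number of times than expected:

```python
from urllib.parse import quote

search = "B&W Speaker"
# encode the wildcard-wrapped term; & becomes %26, space becomes %20
encoded = quote("%" + search + "%", safe="")
print(encoded)  # %25B%26W%20Speaker%25

url = "Item.json?limit=10&description=~," + encoded
```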

thx!

maarten]]>