Exporting data to a standalone SQL database

dbgaragedays Posts: 3 Member

I have seen a few topics here where people are running nightly batch jobs that export ALL of their POS data to a separate (local) database, and then create and run reports against the local DB.

I want to create this data dump but am new to this and the API.

Is there complete documentation on this, with code examples and API examples?

Can someone point me in the right direction?


Thanks

Jason

2 comments

  • Adrian Samuel Posts: 521 Moderator, Lightspeed Staff

    Hey @dbgaragedays,


    We don't have application tutorials, but we'd be happy to point you in the right direction on the kinds of queries you'll need to make to specific API endpoints.

    First, you'll need to register and follow the authentication steps. At the end of that process you should have an expiring access token, which will need to be refreshed every 30 minutes. That token, however, enables you to make requests against the authorised account.
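    Just as an illustration, a minimal Python sketch of that refresh step might look like the following. It assumes the standard OAuth refresh-token grant and the cloud.lightspeedapp.com token endpoint; the client ID, client secret, and refresh token are placeholders you receive when registering your application.

    ```python
    import requests

    # Assumed OAuth token endpoint for Lightspeed Retail; confirm against your app registration docs.
    TOKEN_URL = "https://cloud.lightspeedapp.com/oauth/access_token.php"

    def refresh_access_token(client_id, client_secret, refresh_token):
        """Exchange a refresh token for a new short-lived access token."""
        resp = requests.post(TOKEN_URL, data={
            "grant_type": "refresh_token",
            "client_id": client_id,
            "client_secret": client_secret,
            "refresh_token": refresh_token,
        })
        resp.raise_for_status()
        return resp.json()["access_token"]  # refresh again when this expires (~30 minutes)
    ```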


    To get sales information, you'll need to pull your sales data from the Sale endpoint.

    More can be read about that endpoint here: https://developers.lightspeedhq.com/retail/endpoints/Sale/

    You might also want to look at loading the SaleLines relation to get the line items of each sale.

    A typical request might look like this:

    https://api.lightspeedapp.com/API/Account/{accountID}/Sale.json?load_relations=["SaleLines"]

    Since you're looking to run daily jobs, you'll want to filter on a specific timestamp; more information about that can be found here: https://developers.lightspeedhq.com/retail/introduction/parameters/

    Remember that timestamps are returned in UTC, so you'll want to offset the timestamp you're querying for by the store's local timezone (see the sketch below).
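    To make that concrete, here is a rough Python sketch (not official sample code) that builds yesterday's window in the store's local timezone, converts it to UTC, and requests sales with their SaleLines loaded. The account ID, timezone, and the "><" between operator from the parameters page are assumptions to adapt to your own setup.

    ```python
    from datetime import datetime, timedelta
    from zoneinfo import ZoneInfo
    import requests

    ACCOUNT_ID = "123456"                    # placeholder: your account ID
    STORE_TZ = ZoneInfo("America/New_York")  # placeholder: the store's local timezone
    access_token = "..."                     # token from the refresh step above

    # Yesterday, measured in the store's local time, expressed as UTC instants.
    local_midnight = datetime.now(STORE_TZ).replace(hour=0, minute=0, second=0, microsecond=0)
    start_utc = (local_midnight - timedelta(days=1)).astimezone(ZoneInfo("UTC"))
    end_utc = local_midnight.astimezone(ZoneInfo("UTC"))

    params = {
        "load_relations": '["SaleLines"]',
        # "><" is the between operator described on the parameters page.
        "timeStamp": f"><,{start_utc.isoformat()},{end_utc.isoformat()}",
    }
    resp = requests.get(
        f"https://api.lightspeedapp.com/API/Account/{ACCOUNT_ID}/Sale.json",
        params=params,
        headers={"Authorization": f"Bearer {access_token}"},
    )
    sales = resp.json().get("Sale", [])
    ```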

    Also remember that Lightspeed returns a maximum of 100 objects per query, so you'll need to iterate, increasing the offset each time, to collect all of your results; a simple loop in your code will handle this (see the sketch below).
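    A loop along these lines (again just a sketch, using the "limit"/"offset" query parameters described in the parameters documentation) would page through the results:

    ```python
    import requests

    def fetch_all(url, params, headers, page_size=100):
        """Page through an endpoint 100 records at a time by increasing the offset."""
        results, offset = [], 0
        while True:
            page_params = dict(params, limit=page_size, offset=offset)
            resp = requests.get(url, params=page_params, headers=headers)
            resp.raise_for_status()
            batch = resp.json().get("Sale", [])
            if isinstance(batch, dict):  # a single record comes back as an object, not a list
                batch = [batch]
            results.extend(batch)
            if len(batch) < page_size:   # a short page means we've reached the end
                break
            offset += page_size
        return results
    ```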


    You should be able to achieve this in whatever programming language you choose; then you can bulk insert (or loop insert) the structured data into a DB, and finally run SQL queries against that DB to pull the data back out (a sketch follows).
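    As a sketch of that last step, the snippet below flattens each sale's SaleLines into rows and bulk inserts them into a local SQLite file (swap in your SQL Server/MySQL driver of choice); the column and field names are illustrative, not a fixed schema.

    ```python
    import sqlite3

    def store_sales(sales, db_path="pos_export.db"):
        """Flatten Sale/SaleLine records into one table and bulk insert them."""
        conn = sqlite3.connect(db_path)
        conn.execute("""CREATE TABLE IF NOT EXISTS sale_lines (
            sale_id TEXT, sale_line_id TEXT, item_id TEXT,
            unit_qty TEXT, unit_price TEXT, time_stamp TEXT)""")
        rows = []
        for sale in sales:
            lines = sale.get("SaleLines", {}).get("SaleLine", [])
            if isinstance(lines, dict):  # a sale with one line returns an object, not a list
                lines = [lines]
            for line in lines:
                rows.append((sale.get("saleID"), line.get("saleLineID"),
                             line.get("itemID"), line.get("unitQuantity"),
                             line.get("unitPrice"), line.get("timeStamp")))
        conn.executemany("INSERT INTO sale_lines VALUES (?, ?, ?, ?, ?, ?)", rows)
        conn.commit()
        conn.close()
    ```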


    Hope this helps!

    Adrian Samuel

    API Integrations Consultant - Strategic Solutions

    Lightspeed HQ

  • gregarican Posts: 308 Member

    One thing that helped with some of the heavy lifting was this library --> https://www.nuget.org/packages/lightspeedAccess/. If you are using or plan to use .NET, you might find it helpful. For my application I take the JSON API response I get back from Lightspeed, convert it into a DataSet, and then iterate over the DataTables to load them into SQL Server. That routine just becomes a generic method that I can apply to whichever API endpoint I'm querying.
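    For anyone not on .NET, a rough Python analogue of that generic, endpoint-agnostic loader (purely illustrative, with SQLite standing in for SQL Server) might look like this:

    ```python
    import sqlite3

    def load_endpoint(conn, endpoint_name, records):
        """Flatten the top-level scalar fields of an endpoint's records into a
        table named after the endpoint (illustrative only)."""
        if isinstance(records, dict):
            records = [records]
        if not records:
            return
        # Keep only scalar fields so every record maps cleanly onto columns.
        columns = [k for k, v in records[0].items() if not isinstance(v, (dict, list))]
        conn.execute(f"CREATE TABLE IF NOT EXISTS {endpoint_name} "
                     f"({', '.join(c + ' TEXT' for c in columns)})")
        placeholders = ", ".join("?" for _ in columns)
        conn.executemany(
            f"INSERT INTO {endpoint_name} ({', '.join(columns)}) VALUES ({placeholders})",
            [tuple(str(rec.get(c, "")) for c in columns) for rec in records],
        )
        conn.commit()
    ```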


    Good luck!
