
Data Types and table prefixes in QQube | CLEARIFY

Posted in: QQube / General QQube Usability

Data Types

  • We've been using QQube successfully for a couple of years and are currently working to embed BI reports generated from QuickBooks into Dynamics 365.  We have successfully made the connection on a couple of reports.

    Now we realize that modeling our own dataset would make the presentation of data in D365 much easier.  I have looked at the SQL Anywhere DB structure and have the following two questions:

     

    1.) Can someone point me to the differences between tables based on their prefix?  vd_ / vdf_ / vf_ / vl_

    2.) Is anyone aware of a connector from SQL Anywhere to Microsoft Dataverse?  If I could load the QQube data, I could create the views in D365, bypassing the need for BI and dataset refreshes.

    #1 above is the most pressing.

    Thanks

    • vf_ = FACT
    • vd_ = DIMENSION
    • vdf_ = DIMENSION (these items used to be part of the FACT tables; changed in versions 6.x and later)
    • vl_ = LOOKUP, meaning they are only for looking up a list and don't attach to anything.

    To connect to our database, all you need to use is the two ODBC DSNs.  Information on those is found at the bottom of this guide: QQube for QuickBooks 10.x Installed Components | CLEARIFY
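
    A minimal sketch of that connection from Python, assuming the pyodbc package and the QQubeFinancials DSN and credentials quoted later in this thread (SYS.SYSTAB is SQL Anywhere's catalog of tables and views):

      # Connect over the QQube ODBC DSN and list the views by prefix,
      # showing the FACT / DIMENSION / LOOKUP split described above.
      import pyodbc

      conn = pyodbc.connect("DSN=QQubeFinancials;UID=QQubeFinancials;PWD=Financials")
      cursor = conn.cursor()
      cursor.execute(
          "SELECT table_name FROM SYS.SYSTAB "
          "WHERE table_name LIKE 'vf%' OR table_name LIKE 'vd%' OR table_name LIKE 'vl%'"
      )
      for (name,) in cursor.fetchall():
          print(name)
      conn.close()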

    With regard to the schema, I would open up a Microsoft Access, Power BI, or PowerPivot example.  It is very simple: one FACT table with multiple DIMENSIONS.

    Connection naming is very easy: FACT has LinkToClassID, and DIMENSION has LinkForClassID.
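
    As a hedged illustration of that convention, here is a sketch of one FACT-to-DIMENSION join; the table names appear later in this thread, but the column names are hypothetical placeholders for the LinkTo.../LinkFor... pair, so check the Access examples for the real ones:

      # Join a FACT table to a DIMENSION on the LinkTo/LinkFor key pair.
      # Column names are illustrative, not confirmed QQube schema.
      import pyodbc

      conn = pyodbc.connect("DSN=QQubeFinancials;UID=QQubeFinancials;PWD=Financials")
      sql = """
          SELECT d.CustomerName, SUM(f.LineSalesAmount) AS total_sales
          FROM vf_SALESDETAIL AS f
          JOIN vd_Customer AS d
            ON f.LinkToCustomerID = d.LinkForCustomerID
          GROUP BY d.CustomerName
      """
      for row in conn.execute(sql):  # Connection.execute returns a cursor
          print(row.CustomerName, row.total_sales)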

    And of course, you don't tie one FACT table to another, e.g. one analytic to another, as everything is already self-contained.  For instance, inventory already contains sales, purchases, and G/L detail.  Read more here: Data Model Selections | CLEARIFY

    I would also peruse this section, which should give you detailed information on the underpinnings of QQube and how you should approach your goals here: Advanced Users and Power BI Developers | CLEARIFY

  • Thank you for the info above on #1.  It is what we suspected, as it is close to a star schema, but we could not locate a definitive answer in your documents.

      Again, we've been using your product for a while with great success and understand the ODBC connections.  With that said, we do not want to use ODBC to bring the data into M$ Dataverse, but would prefer to connect directly via a dataflow using a connector to populate the data into the D365 environment.  I wasn't sure if anyone knew of a connector for SQL Anywhere / Sybase that could handle this.

    If we have to, we can do a single dataset and then bring it to M$ Dataverse in a separate step, but we would prefer to remove any interim steps if possible.  Reviewing your samples, several query names would need to be changed so that there is no confusion in measures when the query names are the same but based on different data.  We can then use Tabular Editor to pull all of the existing measures that we might want, unless you have a better way to move those.

    Here are the items and how I believe they would need to be changed (easy enough if we can't go straight to M$ DV):

    1. COMMON
      1. vd_Account (Account)
      2. vd_BinLocation (Bin Location)
      3. vd_FiscalYrTxnDate (Calendar - Fiscal Date)
      4. vd_Class (Class)
      5. vd_Company (Company)
      6. vd_Customer (Customer)
      7. vd_item (Item)
      8. vd_job (Job)
      9. vd_site (Site)
      10. vd_SourceName (Source Name)
      11. vd_UofM (UofM)
      12. vd_Vendor (Vendor)

     

    2. PURCHASE
      1. vf_PURCHASINGDETAIL (PURCHASING VALUES / Purchase - Values)
      2. vdf_PURCHASING_DOC_ATTRIB (Document Attributes / Purchase - Doc Attribs)
      3. vdf_PURCHASING_DOC_DATES (Document Dates / Purchase - Doc Dates)
      4. vdf_PURCHASING_GEOGRAPHY (Geography / Purchase - Geography)
      5. vdf_PURCHASING_LINE_ATTRIB (Line Attributes / Purchase - Line Attribs)

     

    3. INVENTORY
      1. vf_INVENTORY (INVENTORY VALUES / Inventory - Values)
      2. vd_AccountInventory (Account - Inventory / Inventory - Account)
      3. vd_AccountSales (Account - Sales / Sales - Account)
      4. vdf_INVENTORY_DOC_ATTRIB (Document Attributes / Inventory - Doc Attribs)
      5. vdf_INVENTORY_DOC_DATES (Document Dates / Inventory - Doc Dates)
      6. vdf_INVENTORY_LINE_ATTRIB (Line Attributes / Inventory - Line Attribs)

     

    4. SALES
      1. vf_SALESDETAIL (SALES VALUES / Sales - Values)
      2. vd_CalYrTxnDate (Calendar - Sales Transaction Date)
      3. vd_SalesRep (Sales Rep)
      4. vdf_SALES_DOC_ATTRIB (Document Attributes / Sales - Doc Attribs)
      5. vdf_SALES_DOC_DATES (Document Dates / Sales - Doc Dates)
      6. vdf_SALES_GEOGRAPHY (Geography / Sales - Geography)
      7. vdf_SALES_LINE_DATES (Line Dates / Sales - Line Dates)
      8. vdf_SALES_LINE_ATTRIB (Line Attributes / Sales - Line Attribs)
    5. OPEN PO
      1. vf_OPENPO (OPEN PURCHASE ORDER VALUE / PO - Open Value)
      2. vd_CalYrExpectedDate (Calendar - PO Expected Date)
      3. vdf_OPENPO_DOC_ATTRIB (Document Attributes / PO - Open Doc Attribs)
      4. vdf_OPENPO_DOC_DATES (Document Dates / PO - Open Doc Dates)
      5. vdf_OPENPO_GEOGRAPHY (Geography / PO - Open Geography)
      6. vdf_OPENPO_LINE_ATTRIB (Line Attributes / PO - Open Line Attribs)
      7. vdf_OPENPO_LINE_DATES (Line Dates / PO - Open Line Dates)
    6. OPEN SO
      1. vf_OPENSO (OPEN SALES ORDER VALUES / SO - Open Value)
      2. vd_CalYrShipDate (Calendar - SO Ship Date)
      3. vd_CustomerRep (Customer Rep / Rep - Customer)
      4. vd_JobRep (Job Rep / Rep - Job)
      5. vdf_OPENSO_DOC_ATTRIB (Document Attributes / SO - Open Doc Attribs)
      6. vdf_OPENSO_DOC_DATES (Document Dates / SO - Open Doc Dates)
      7. vdf_OPENSO_GEOGRAPHY (Geography / SO - Open Geography)
      8. vdf_OPENSO_LINE_ATTRIB (Line Attributes / SO - Open Line Attribs)
  • and

     

    7. AR
      1. vf_OpenAR (ACCOUNTS RECEIVABLE VALUES / AR - Values)
      2. vdf_OPENAR_DOC_ATTRIB (Document Attributes / AR - Doc Attribs)
      3. vdf_OPENAR_DOC_DATES (Document Dates / AR - Doc Dates)
      4. vdf_OPENAR_LINE_ATTRIB (Line Attributes / AR - Line Attribs)
      5. vdf_OPENAR_LINE_DATES (Line Dates / AR - Line Dates)
      6. vdf_OPENAR_GEOGRAPHY (Geography / AR - Geography)
      7. vd_SalesTaxItem (Sales Tax Item / Item - Sales Tax)
      8. vd_Terms (Terms)
      9. vf_FINANCIALTBACLASS (TRIAL BALANCE ACTIVITY BY CLASS / Trial Bal - Class)
      10. $Account? (REDUNDANT?)
      11. $Class? (REDUNDANT?)
      12. $Company? (REDUNDANT?)
  • With all due respect, you actually made it too tough on yourself - in several areas.  I am sorry that you spent the time to delineate your details, and I am also sorry that we were not emphatic or clear in our first response.

    Just open up the Microsoft Access examples, and you will see ALL the connections for each analytic.  Then you won't have to guess or surmise what belongs where.   There is only one way to connect the data anyway.

    Yes, we provide star schemas in the traditional data warehousing methodology: one FACT table and multiple DIMENSIONS.

    First, there is no redundancy anywhere.  You can't use the same dimension on several FACT tables, e.g. the account, company, and class DIMENSIONS on both the Profit and Loss FACT and the Balance Sheet FACT; they have to be aliases.

    Second, with regard to "common dimensions": we may have certain fields in vd_Account for one analytic, but a slightly different vd_AccountBS for use with the Balance Sheet analytic.  Yes, vd_Company may be used on different analytics, but you would still need to create an alias if there were several FACT tables in your modeling.
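
    A small sketch of that aliasing, assuming you are pulling the tables yourself into pandas over pyodbc (the alias names are illustrative):

      # Pull a shared dimension once, then materialize one alias per
      # FACT table, as described above.
      import pandas as pd
      import pyodbc

      conn = pyodbc.connect("DSN=QQubeFinancials;UID=QQubeFinancials;PWD=Financials")
      company = pd.read_sql("SELECT * FROM vd_Company", conn)

      company_pl = company.copy()  # relate to the Profit and Loss FACT
      company_bs = company.copy()  # relate to the Balance Sheet FACT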

    With regard to a direct connector: the ODBC DSNs are the only way to connect to the QQube database, and they are the easiest and simplest form of connection anyway; permissions are properly affixed, and the passwords are listed in the document we gave you earlier.

    What would be best is to create a linked server - if that is possible with the product you are connecting to.  That is how Microsoft Access works (or even SQL Server).  
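
    On SQL Server, for example, the linked server can be created over the MSDASQL (OLE DB over ODBC) provider.  A hedged sketch, with the SQL Server connection details as assumptions:

      # Create a SQL Server linked server over the QQube ODBC DSN via
      # MSDASQL. The SQL Server instance and driver are assumptions.
      import pyodbc

      sqlserver = pyodbc.connect(
          "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;Trusted_Connection=yes",
          autocommit=True,
      )
      cur = sqlserver.cursor()
      cur.execute(
          "EXEC sp_addlinkedserver @server = N'QQUBE', @srvproduct = N'', "
          "@provider = N'MSDASQL', @datasrc = N'QQubeFinancials'"
      )
      cur.execute(
          "EXEC sp_addlinkedsrvlogin @rmtsrvname = N'QQUBE', @useself = N'False', "
          "@rmtuser = N'QQubeFinancials', @rmtpassword = N'Financials'"
      )
      # Then query the QQube views with OPENQUERY:
      #   SELECT * FROM OPENQUERY(QQUBE, 'SELECT * FROM vf_SALESDETAIL')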

    But use the Microsoft Access examples (and bring over the connections in the schema that already exist) - it will make life very easy for you.  There is absolutely no reason for you to re-invent the wheel.

  • Thank you for your response.  You were very clear in your initial response, and I understand what you are saying with respect to ODBC being the easiest way to connect and grab data for BI.  Unfortunately it is not the easiest when wanting to use the data in M$ Dataverse, due to a limitation on creating relationships based on ODBC data.

    I was able to connect to the data via a Sybase connector and pull everything that was needed.  It is populating a BI report correctly, and I can validate the data against a report with ODBC as the source.  I simply need to add the connector to a dataflow in M$ Power Platform, and I can use the native data in D365 views instead of embedding a linked Power BI report.  Also, by going this route we can update D365 fields and create D365 calculated fields based on the data from QB.

    ODBC would force the following path for the data to arrive in an embedded view:

    QQube > BI DataSet > BI Report > D365 Embedded BI Report

     

    Direct DB access via Sybase connector:

    QQube > M$ Dataverse (data that can be filtered live via system and local views in D365).

     

    Two steps with "live" data versus four steps with a "snapshot" is worth the effort on this.

     

    If you want the steps we took to make this work let us know and we'll DM them to you.

     

    Thanks for the info above, as it helped clear up some questions we had.

  • One thing to add.  I may be reading between the lines, but it appears that you are trying to create one FACT table with all measures from all analytics.  Just not possible - for any number of reasons.

    First, the SDK doesn't make that possible.  End of story.

    But also realize that each analytic is specific unto itself - and doesn't bleed into other analytics.  

    You may have certain transactions used in one analytic and not another.  Or logic being used in a particular situation, that does not match QB logic in another analytic.

    Each field name is unique within the FACT tables, even though you might see our online documentation show "Line Sales Amount" in both the Sales Analytic and the Inventory Analytic.

    When you pull the data across from QQube, the fields will have different names.  For ease of use to the end user, we have field aliases in the Excel Add-In, PowerPivot, and Power BI.

    Second, the logic for the Sales Analytic is different from what is used for the Inventory Analytic, as Inventory only looks at specific items, but Sales looks at all sales items.

    There are specific reasons for creating analytics within a particular silo - they serve completely different needs.

  • Unfortunately we are not able to accommodate a direct connection to the database, other than OLE DB.

    Does your platform not have an OLE DB connector? Then you could simply use this connection:

    "Provider=MSDASQL;DSN=QQubeFinancials;UID=QQubeFinancials;PWD=Financials;"

    Noted - we actually are using the default names now that we have access to the DB.  One of the things our team has fought with over the past couple of years is the following scenario:

    JIT (Just in Time) Inventory

    We need to be able to calculate the following for an individual item:

    QTY - US Warehouse

    QTY - Transit (Sea Container) with estimated arrival.  Not summary data, but actual container arrival date(s)

    QTY - Transit (Air) with estimated arrival.  Not summary data, but actual arrival date(s)

    QTY - Sales Order(s).  Not summary data, but actual promised ship date(s)

    QTY - Open PO.  Not summary data, but actual completion date(s)

    QTY - Run Rate - monthly projected amount based on prior months, and the next quarter projected from the same quarter of the prior year (weekly amounts); see the sketch after this list

    etc, etc, etc
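
    As a hedged sketch of that run-rate item (the column names and the three-month window are illustrative assumptions; the data would come from the Sales analytic, e.g. vf_SALESDETAIL):

      # Run rate for one item: monthly projection from a trailing
      # three-month average, plus next quarter projected from the same
      # quarter of the prior year, spread over ~13 weeks.
      import pandas as pd

      sales = pd.DataFrame({  # one row per sales line for a single item
          "TxnDate": pd.to_datetime(["2023-10-05", "2023-11-12", "2023-12-20",
                                     "2024-10-03", "2024-11-18", "2024-12-09"]),
          "Quantity": [120, 90, 150, 110, 95, 160],
      })
      monthly = sales.set_index("TxnDate")["Quantity"].resample("ME").sum()

      run_rate = monthly.tail(3).mean()             # trailing three-month average
      prior_q = monthly["2023-10":"2023-12"].sum()  # same quarter, prior year
      weekly_next_q = prior_q / 13                  # ~13 weeks per quarter

      print(f"monthly run rate: {run_rate:.1f}, weekly next quarter: {weekly_next_q:.1f}")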

    We accomplished 95% of it with the predefined "silos", but when we need to calculate the arrival dates and ship dates of the individual transactions behind the summary, that is where we're missing the next level of data.

    We just tested it in a BI report with data in a consolidated set of queries, and it works perfectly, even giving us historical purchase amounts to monitor price creep and tariffs.

     

    Great product!  Love QQube!

  • Answered

    We connected directly.  See below.  All good.

     

  • James and Chuck,

    Thanks for posting this so all can see. It is an interesting discussion. 
