dbt-utils macros

Utility functions for dbt projects. dbt-utils is a package maintained by dbt Labs that provides a collection of reusable macros and generic tests that can be (re)used across dbt projects. On warehouses the package does not support directly, a compatibility ("shim") package supplies the adapter-specific implementations; for Teradata, see the teradata_utils package for install instructions.
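Installation follows the usual dbt package workflow: add the package to packages.yml and run dbt deps. A minimal sketch, with an illustrative version pin (check dbt Hub for the latest release and the docs for more information on installing packages):

```yaml
# packages.yml
packages:
  - package: dbt-labs/dbt_utils
    version: 1.1.1   # illustrative pin; use the latest version listed on hub.getdbt.com
```

After installing a package into your project, you can use any of its macros in your own models.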
Using pre-built macros from dbt-utils can save time and reduce errors compared to writing custom code, and leaning on a community-maintained library means common problems are solved once instead of in every project. The package groups its macros into a few families:

- SQL generators such as date_spine, star, pivot, union_relations, and get_column_values, which write repetitive SQL for you;
- generic tests such as expression_is_true, unique_combination_of_columns, and equality, which extend dbt's built-in tests;
- introspective macros such as run_query and get_query_results_as_dict, which run a query and return the results of the query as objects you can work with in Jinja;
- assorted Jinja and web helpers.

One caveat applies across the package: macros that rely on the information schema cannot be used with ephemeral models. If you try, dbt raises an error along the lines of "The `star` macro cannot be used with ephemeral models, as it relies on the information schema. `my_model` is an ephemeral model. Consider making it a view or table instead."

Cross-database macros (dateadd, datediff, concat, hash, split_part, current_timestamp, and friends) originally lived in dbt-utils but have since been migrated into dbt Core, so they are now available regardless of whether dbt-utils is installed. This happens every now and again: utility macros get upstreamed once they become prerequisites for core dbt functionality (dbt's own current_timestamp, for example, was needed for source freshness). These macros let you write warehouse-agnostic SQL; instead of looking up each warehouse's syntax, you write the call the same way every time and the macro compiles it for your adapter. Using the datediff macro, you can calculate the difference between two dates without worrying about the database-specific function, and split_part(string_text, delimiter_text, part_number) splits a string by the supplied delimiter and returns the requested 1-based part (a negative part_number counts backward from the end of the string). Note that the datepart argument is database-specific.
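As a quick illustration of the cross-database style, here is a minimal sketch (the model and column names are hypothetical); the same call compiles to the right function on each adapter:

```sql
-- models/order_durations.sql (hypothetical model and columns)
select
    order_id,
    -- days between the order and its shipment
    {{ dbt.datediff('ordered_at', 'shipped_at', 'day') }} as days_to_ship,
    -- add a fixed interval to a date
    {{ dbt.dateadd('day', 7, 'ordered_at') }} as follow_up_at
from {{ ref('stg_orders') }}
```

The same signature was previously exposed through the package as dbt_utils.dateadd(datepart, interval, from_date_or_timestamp).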
A surrogate_key macro to the rescue. Thanks to generate_surrogate_key in the dbt_utils package (the successor to the older surrogate_key macro, which accepted only varargs and later also a list), you can fire yourself from the business of wrapping your columns in coalesce every time you want to build a key. You hand the macro a list of column names or expressions; it concatenates them with delimiters, handles nulls for you, and hashes the result, so forming your surrogate keys with this macro has the benefit of elegant and DRY null handling. The list entries can themselves be expressions or calls to your own macros; for example, a column that arrives from the source system as a zero-padded VARCHAR such as '0001' can be wrapped in NVL and a custom string_to_number macro inside the key's column list before hashing.

One sizing note: the macro uses an MD5 hash by default, so you have a 50% chance of a collision only once you reach roughly 2^64 records (about 1.84 x 10^19, a whole lot of data). While very, very unlikely, it is something to consider for truly massive datasets. The underlying hash macro is also available on its own, so you can write {{ dbt_utils.hash('mycolumnname') }} just like your friends on Snowflake.
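A minimal sketch of the macro in a model (the model and column names are hypothetical):

```sql
-- models/fct_order_lines.sql (hypothetical)
select
    -- one key built from several natural-key parts; null handling happens inside the macro
    {{ dbt_utils.generate_surrogate_key(['order_id', 'line_number']) }} as order_line_key,
    *
from {{ ref('stg_order_lines') }}
```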
date_spine is one of the most used SQL generators. It returns the SQL required to build a date spine: you give it a datepart plus a start_date and an end_date, and it generates one row per period between them, which is exactly the backbone you need for a date dimension. Some wrapper packages layer on the option of specifying a number of periods (n_dateparts) in the past from today instead of explicit start and end dates. One adapter note: the stock implementation relies on a nested common table expression, which does not work as expected on Azure Synapse, so the SQL Server / Synapse shim packages ship their own version. Usage to build a daily date dimension for the years 2015 to 2022:
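(A minimal sketch; the cast expressions follow the package's documented calling pattern, and the date literals may need adjusting for your warehouse.)

```sql
-- models/dim_date_spine.sql
{{ dbt_utils.date_spine(
    datepart="day",
    start_date="cast('2015-01-01' as date)",
    end_date="cast('2023-01-01' as date)"
   )
}}
```

The end date is not included in the spine, so running through 2023-01-01 yields every day of 2015 through 2022.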
union_relations is the generator to reach for when several relations share (roughly) the same columns and you want them stacked into one model. You pass it a list of Relation objects, typically ref() or source() calls, and it works out the superset of columns, fills in nulls where a column is missing from one relation, and unions everything together. Two things to keep in mind: the macro performs a UNION ALL, so it does not de-duplicate rows; if you need distinct results you have to de-duplicate downstream or write a custom wrapper macro. One reported gotcha: passing a dynamically built list of Relation objects can leave the lineage to the left of the union model missing from dbt docs (observed at least on Redshift).
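A minimal sketch, assuming two regional staging models exist (the relation and column names are hypothetical):

```sql
-- models/stg_orders_unioned.sql (hypothetical relations)
{{ dbt_utils.union_relations(
    relations=[ref('stg_orders_us'), ref('stg_orders_eu')],
    exclude=["_loaded_at"]
) }}
```

By default the macro also adds a _dbt_source_relation column recording which relation each row came from, which is handy when you need to de-duplicate or debug downstream.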
dbt-utils on other warehouses. The dbt-utils package is supported on Teradata through the teradata/teradata_utils dbt package, which provides a compatibility layer between dbt_utils and dbt-teradata (in later releases some of its macros were migrated upstream as cross-database support improved). The same shim pattern exists elsewhere: spark_utils may provide compatibility for Apache Spark and Databricks, athena_utils covers Athena, trino_utils covers Trino (wherever a custom Trino macro exists, dbt_utils adapter dispatch will pass through to trino_utils), and the dbt-msft packages cover SQL Server and Synapse. If you maintain a package that leverages dbt-utils macros for cross-database compatibility, the shim does not need to be specified as a dependency in your packages.yml; instead, encourage anyone using your package on that warehouse to install the shim alongside it.

The mechanism behind all of this is dispatch. The dispatch config in dbt_project.yml optionally overrides the search locations for macros in certain namespaces; if it is not specified, dispatch will look in your root project first, by default, and then look for implementations in the package named by macro_namespace. That default is also what lets you override a package macro: define your own version in your project and it will be used whenever you call the macro. Following the concat example, a special null-handling concat defined in your project will be used on Redshift whenever your models call it, but the version of the concat macro within the dbt-utils package will not use your override unless you add your project to that namespace's search order.
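The Athena shim, for example, documents a dispatch block like the one below in dbt_project.yml; the same shape works for any shim, or with your own project name in the search order when you want dbt-utils' internal calls to pick up your overrides:

```yaml
# dbt_project.yml
dispatch:
  - macro_namespace: dbt_utils
    search_order: ['athena_utils', 'dbt_utils']
  - macro_namespace: dbt_expectations
    search_order: ['athena_utils', 'dbt_expectations']
```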
Generic tests. There are tons of generic data tests defined in open source packages such as dbt-utils and dbt-expectations, so the test you're looking for might already exist. With the expression pattern we will generally use expression_is_true from dbt_utils, which asserts that a SQL expression holds for every row; unique_combination_of_columns checks that a set of columns is unique in combination; and the relation-comparison helpers take the two relations to compare (a_relation and b_relation), a required primary_key_columns list of primary key column(s) used to join the queries together for comparison, and an optional columns list naming the columns present in the two queries you want to compare. To define your own generic tests, simply create a test block; if your generic test depends on complex macro logic, you may find it more convenient to define the macros and the generic test in the same file. Using a macro in a model and then adding dbt tests on the result lets you be confident in your code, and make sure the macros themselves are well documented and tested: use schema files to document the purpose, arguments, and usage of each macro.

dbt-utils is not the only utility package in the ecosystem. Fivetran's dbt packages ship fivetran_utils, plus salesforce_formula_utils for formula fields synced from Fivetran; the snowplow-web dbt package is accompanied by snowplow-utils; fhir-dbt-utils supports analytics over FHIR in BigQuery or Apache Spark; and Process Mining app templates come with a package called pm_utils containing utility functions and macros for Process Mining dbt projects (for more info about the pm_utils macros, see ProcessMining-pm-utils).
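A minimal sketch of the schema-file syntax for two of these tests (the model and column names are hypothetical, and depending on your dbt version the key may be tests: or data_tests:):

```yaml
# models/marts/orders.yml (hypothetical)
models:
  - name: orders
    tests:
      - dbt_utils.unique_combination_of_columns:
          combination_of_columns:
            - order_id
            - order_line_id
      - dbt_utils.expression_is_true:
          expression: "amount_paid <= amount_ordered"
```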
Build long lists with a few exclusions with dbt_utils.star. The star macro prints out the full list of columns in a relation but skips the ones listed in its except argument, which lets you express the same logic while writing fewer lines of code. A few practical notes. star only accepts a relation (a ref() or source()) for its from argument. At parse time it returns an empty string, because the execute flag is not yet set and dbt cannot query the information schema; the real column list is filled in at execution time, so an empty result while parsing is expected rather than a bug. In some versions, if the exclusions leave no columns the macro renders a commented-out block, and a stray comma can end up outside the comment if star is not the last field in your select list, so keep an eye on that. Finally, unit tests: when configuring a unit test you can override the output of macros, project variables, or environment variables, and if you're unit testing a model that uses the star macro you must explicitly set star to a list of columns (for example via dbt_utils.get_filtered_columns_in_relation). This is because star only accepts a relation for the from argument, while unit-test mock input data is injected directly into the model SQL in place of the ref() or source() function, causing the macro to fail unless it is overridden.
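A minimal sketch (the excluded columns are hypothetical housekeeping fields):

```sql
-- models/orders_trimmed.sql (hypothetical)
select
    {{ dbt_utils.star(from=ref('stg_orders'), except=['_fivetran_synced', '_loaded_at']) }}
from {{ ref('stg_orders') }}
```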
The use of such macros not only saves time but also ensures consistency across your SQL scripts. The other, more complex family in dbt-utils is the introspective macros: they run a query and return the results of the query as objects you can iterate over in Jinja, and they are typically abstractions over the statement block. The run_query macro provides a convenient way to run queries and fetch their results; it is a wrapper around the statement block, which is more flexible but also more complicated to use. get_column_values fetches the distinct values of a column (handy for feeding pivot), and get_query_results_as_dict returns the result set as a dictionary you can iterate over, for example to build a CASE statement. A common pattern is to define a list of strings from the results of a query rather than hard-coding it, whether that is pulling every table name from the information schema that matches a prefix like 'GENERAL%' so the tables can be unioned, or turning a "select name from veggies" query into the equivalent of {% set veggies = ['carrots', 'potato', 'broccoli', 'corn'] %}. Because introspective macros query the warehouse, their results are only available at execution time, so guard this kind of logic with the execute flag (or use macros like get_column_values, which handle that for you).

One trade-off to be aware of: using macros can reduce the readability of your models, and quoting strings correctly can be confusing, so be mindful of using them too much. If you're looking to re-use a CTE, it should probably be its own model (a macro would be another choice). This is a simple example of using dbt macros to simplify and shorten your code, and dbt can get a lot more sophisticated as you learn more techniques.
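A minimal sketch of that pattern using run_query behind an execute guard (the veggies and veggie_orders models and the quantity column are hypothetical):

```sql
-- models/veggie_pivot.sql (hypothetical)
{% if execute %}
    {# run_query returns an agate table; take the first column as a plain list of names #}
    {% set results = run_query("select distinct name from " ~ ref('veggies')) %}
    {% set veggie_names = results.columns[0].values() %}
{% else %}
    {# at parse time the query has not run yet, so fall back to an empty list #}
    {% set veggie_names = [] %}
{% endif %}

select
    order_id
    {%- for veggie in veggie_names %},
    sum(case when name = '{{ veggie }}' then quantity else 0 end) as {{ veggie }}_quantity
    {%- endfor %}
from {{ ref('veggie_orders') }}
group by 1
```

dbt_utils.get_column_values(table=ref('veggies'), column='name') wraps the same fetch-a-list step up for you, and dbt_utils.pivot can generate the conditional aggregation itself.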