Swift
09/18/2025, 11:27 PM

Galen Seilis
09/18/2025, 11:46 PM

Paul Haakma
09/22/2025, 8:09 AM

Anton Nikishin
09/26/2025, 9:46 AM
conf/dev/databricks.yml:

default:
  tasks:
    - existing_cluster_id: 0924-121047-3jcdtqh1

kedro databricks bundle --overwrite

AssertionError: lookup_key task_key not found in updates: [{'existing_cluster_id': '0924-121047-3jcdtqh1'}]

Nik Linnane
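From the assertion message, kedro-databricks appears to match override entries against generated tasks by task_key, so an entry carrying only existing_cluster_id has nothing to look up. A hedged sketch of an override that would satisfy the lookup; the key name "default" is an assumption, so check the generated bundle resources for the real task keys:

```yaml
# Assumption: each task override needs a task_key for the bundle step to match
# it against a generated task; "default" is a placeholder, not a verified key.
default:
  tasks:
    - task_key: default
      existing_cluster_id: 0924-121047-3jcdtqh1
```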
10/01/2025, 6:19 PM
init, bundle, deploy

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
File ~/.ipykernel/6689/command--1-4096408574:22
     20 import importlib
     21 module = importlib.import_module("classification_pipeline")
---> 22 module.classification_pipeline()
AttributeError: module 'classification_pipeline' has no attribute 'classification_pipeline'

dev / qa / prod, conf, qa

kedro databricks init
kedro databricks bundle --env qa --params runner=ThreadRunner
kedro databricks deploy --env qa --runtime-params runner=ThreadRunner
kedro databricks run classification_pipeline

Shah
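The traceback shows the generated task importing a module and then calling an attribute of the same name on it. A minimal sketch of that lookup using a stand-in module (the registration below only simulates a packaged wheel; the actual fix is to expose a top-level callable named classification_pipeline in the classification_pipeline module):

```python
import importlib
import sys
import types

# Stand-in for the packaged project module; in a real wheel this would be
# classification_pipeline/__init__.py exposing the entry-point callable.
mod = types.ModuleType("classification_pipeline")

def classification_pipeline():
    # Top-level callable with the same name as the module, so
    # getattr(module, "classification_pipeline") resolves.
    return "pipeline ran"

mod.classification_pipeline = classification_pipeline
sys.modules["classification_pipeline"] = mod

# What the generated Databricks task effectively does, per the traceback:
module = importlib.import_module("classification_pipeline")
result = module.classification_pipeline()
```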
10/02/2025, 10:49 AM
parameters_data_processing.yml:

column_rename_params: # Suffix to be added to overlapping columns
    skip_cols: ['Date'] # Columns to skip while renaming
    co2: '_co2'
    motion: '_motion'
    presence: '_presence'
data_clean_params:
  V2_motion: {
        condition: '<0',
        new_val: 0
        }
  V2_presence: {
        condition: '<0',
        new_val: 0
        }
  infinite_values:
    infinite_val_remove: true
    infinite_val_conditions:
      - column_name: V2_motion
        lower_bound: -1e10
        upper_bound: 1e10
      - column_name: V2_presence
        lower_bound: -1e10
        upper_bound: 1e10

column_rename_params['co2'], column_rename_params['motion']

inputs=['co2_processed', 'params:column_rename_params:co2', 'params:column_rename_params:skip_cols']

"not found in the DataCatalog", catalog.yml, "params:column_rename_params"

'params:<key>', 'parameters:<key>'

Shah
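For what it's worth, Kedro addresses nested parameters with a dot after the top-level key (params:column_rename_params.co2); a second colon, as in params:column_rename_params:co2, will not resolve in the DataCatalog. A rough illustration of the flattening rule, not Kedro's actual internals:

```python
# Illustrative only: mimics how nested parameters become dot-addressable
# "params:" entries; Kedro's real DataCatalog implements this differently.
def flatten_params(params, prefix="params:"):
    entries = {}
    for key, value in params.items():
        name = f"{prefix}{key}"
        entries[name] = value
        if isinstance(value, dict):
            # Nested keys use dots: params:column_rename_params.co2
            entries.update(flatten_params(value, prefix=name + "."))
    return entries

params = {"column_rename_params": {"skip_cols": ["Date"], "co2": "_co2"}}
entries = flatten_params(params)
# The dot form is present; the colon form is not.
```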
10/03/2025, 6:22 PM

Sreekar Reddy
10/04/2025, 9:56 AM

Mamadou SAMBA
10/06/2025, 3:45 PM

some_dataset:
  type: spark.SparkDataset
  file_format: delta
  filepath: "gs://<bucket-prefix>${runtime_params:env}-dataset/app_usages"

'', env, None
gs://<bucket-prefix>None-dataset/
gs://<bucket-prefix>-dataset/

"params": build_kedro_params(
    [
        f"project={get_project_id()}",
        f"env={env_param}",  # env_param = ''
        
    ]
)

None, None

Stas
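One way to keep a literal "None" out of the bucket path is to drop the env entry entirely when its value is empty, so the resolver never receives it; build_kedro_params and get_project_id are from the snippet above, while the guard itself is just a sketch with a hypothetical project id:

```python
def build_env_params(project_id, env_param):
    """Skip empty env values so ${runtime_params:env} never interpolates the
    string 'None' into a path like gs://<bucket-prefix>None-dataset/."""
    parts = [f"project={project_id}"]
    if env_param:  # filters out both '' and None
        parts.append(f"env={env_param}")
    return parts

params = build_env_params("my-project", "")  # "my-project" is hypothetical
```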
10/07/2025, 4:15 PM

Stas
10/09/2025, 11:07 AM

Shah
10/09/2025, 4:47 PM

Gianni Giordano
10/13/2025, 12:48 PM

Stas
10/14/2025, 11:13 AM

Flavien
10/15/2025, 12:56 PM
kedro 0.18.12, 1.0.0, 0.19.15

def test_custom_resolvers_in_example(
    project_path: Path,
) -> None:
    bootstrap_project(project_path=project_path)
    # Default value
    with KedroSession.create(
        project_path=project_path,
        env="example",
    ) as session:
        context = session.load_context()
        catalog = context._get_catalog()
        assert timedelta(days=1) == catalog.load("params:example-duration")
        assert datetime(1970, 1, 1, tzinfo=timezone.utc) == catalog.load(
            "params:example-epoch"
        )

0.18.12
CONFIG_LOADER_CLASS = OmegaConfigLoader
0.19.15

E           ValueError: Duplicate keys found in .../conf/local/catalog.yml and .../conf/production/catalog.yml: hourly_forecasts, output_hourly_forecasts

Duplicate keys

Stas
10/15/2025, 1:17 PM

Flavien
10/15/2025, 3:33 PM
kedro-0.19.15.tar.gz, KedroSession.create, kedro.framework.session, extra_params, runtime_params

Stas
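If the breakage here is the extra_params → runtime_params keyword rename on KedroSession.create between Kedro releases, a version-agnostic shim can pick whichever keyword the installed signature accepts. This is a sketch; fake_create below only stands in for the real classmethod, which you should inspect in your own install:

```python
import inspect

def session_param_kwargs(create_fn, params):
    """Return params under whichever keyword this create() accepts."""
    names = inspect.signature(create_fn).parameters
    key = "runtime_params" if "runtime_params" in names else "extra_params"
    return {key: params}

# Stand-in with the newer keyword, for illustration only:
def fake_create(project_path, runtime_params=None):
    return runtime_params

kwargs = session_param_kwargs(fake_create, {"runner": "ThreadRunner"})
# usage idea: KedroSession.create(project_path, **kwargs)
```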
10/16/2025, 1:32 PM

def after_context_created(self, context):
    creds = get_credentials(url, account)
    context.config_loader["credentials"] = {
        **context.config_loader["credentials"],
        **creds,
    }

Paul Haakma
10/17/2025, 4:06 AM

Ayushi
10/17/2025, 11:04 AM

Stas
10/20/2025, 1:58 PM

Pascal Brokmeier
10/20/2025, 2:08 PM

Tim Deller
10/21/2025, 10:30 AM

Shu-Chun Wu
10/24/2025, 2:14 PM

Mohamed El Guendouz
10/24/2025, 3:32 PM

Ayushi
10/27/2025, 6:35 AM
InterpolationResolutionError

CONFIG_LOADER_ARGS = {
    "custom_resolvers": {
        "Our_resolver": reference to resolver
    }
}

NAYAN JAIN
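The custom_resolvers value must map the resolver name to the callable itself; passing a string, or forgetting to register the resolver in settings.py, is a common cause of InterpolationResolutionError. A sketch of the settings.py fragment, assuming CONFIG_LOADER_CLASS is already set to OmegaConfigLoader and using a made-up timedelta resolver as the body:

```python
from datetime import timedelta

def our_resolver(days):
    # Called when config contains ${Our_resolver:7}; the argument
    # arrives as a string from the config file.
    return timedelta(days=int(days))

CONFIG_LOADER_ARGS = {
    "custom_resolvers": {
        "Our_resolver": our_resolver,  # the function object, not "our_resolver"
    },
}
```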
10/29/2025, 1:56 PM

weather:
  type: polars.EagerPolarsDataset
  filepath: s3a://your_bucket/data/01_raw/weather*
  file_format: csv
  credentials: ${s3_creds:123456789012,arn:role}

Raghav Singh
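If s3_creds is meant to be a custom resolver assembling credentials from an account id and a role, it would be registered under custom_resolvers like any other resolver. The function name mirrors the YAML above, but the shape of the returned mapping is a guess; match it to whatever keys your dataset's filesystem layer actually expects:

```python
def s3_creds(account_id, role_name):
    """Hypothetical resolver for ${s3_creds:123456789012,my-role}: builds a
    credentials mapping around an assumed-role ARN. Keys are illustrative."""
    return {
        "client_kwargs": {
            "role_arn": f"arn:aws:iam::{account_id}:role/{role_name}",
        },
    }

creds = s3_creds("123456789012", "my-role")  # "my-role" is a placeholder
```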
10/29/2025, 6:51 PM

Sejal Singh
10/30/2025, 8:59 AM

Chekeb Panschiri
10/30/2025, 4:02 PM