| Dependency | Reason |
|---|---|
| Task Instance State | Task is in the 'success' state which is not a valid state for execution. The task must be cleared in order to be run. |
| Dagrun Running | Task instance's dagrun was not in the 'running' state but in the state 'success'. |
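Both failed checks stem from the task instance already being in the `success` state; Airflow will only re-run it once the instance is cleared. Below is a minimal sketch of clearing it programmatically, mirroring what the `airflow clear` CLI does. This assumes an Airflow 1.10-era API (matching the `/admin/airflow/` URLs in the dump) and that the DAGs folder is importable on the host:

```python
import pendulum
from airflow.models import DagBag

# Load the DAG from the DAGs folder and narrow it to the stuck task only.
dag = DagBag().get_dag('SierraGorda_ParametersStorage')
sub = dag.sub_dag(task_regex='fragmentation_joinWencoFiles',
                  include_upstream=False, include_downstream=False)

# Clearing the task instance for this execution date resets its state so
# the scheduler considers it runnable again.
execution_date = pendulum.parse('2026-01-02T00:45:00+00:00')
sub.clear(start_date=execution_date, end_date=execution_date)
```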
```python
import json

import pandas as pd
from azure.storage.blob import BlobClient

# BLOB_CONNECT_STRING (the Azure storage connection string) is expected to
# be defined elsewhere in this module.


def generar_cruce_wenco_haul_shovel(origin_shovel_file_container, origin_shovel_file_blob,
                                    origin_haul_file_container, origin_haul_file_blob,
                                    origin_load_file_container, origin_load_file_blob,
                                    final_container, final_blob):
    """
    Joins the daily files that come from Wenco.

    Parameters
    ----------
    origin_shovel_file_container: str
        Name of the container holding the shovel file.
    origin_shovel_file_blob: str
        Name of the file containing the shovel data.
    origin_haul_file_container: str
        Name of the container holding the truck file.
    origin_haul_file_blob: str
        Name of the file containing the truck data.
    origin_load_file_container: str
        Name of the container holding the file that links shovels/trucks.
    origin_load_file_blob: str
        Name of the file containing the data that links shovels/trucks.
    final_container: str
        Name of the container where the clean file will be saved.
    final_blob: str
        Name of the file to be saved in the final container.

    Returns
    -------
    None
    """
    # Read Wenco shovel file
    conn_origin_container_shovel = BlobClient.from_connection_string(
        conn_str=BLOB_CONNECT_STRING,
        container_name=origin_shovel_file_container,
        blob_name=origin_shovel_file_blob)
    download_stream_shovel = conn_origin_container_shovel.download_blob().readall()
    data_shovel = json.loads(download_stream_shovel)
    df_shovel = pd.DataFrame(data_shovel['values'], columns=data_shovel['headers'])

    # Read Wenco load file
    conn_origin_container_load = BlobClient.from_connection_string(
        conn_str=BLOB_CONNECT_STRING,
        container_name=origin_load_file_container,
        blob_name=origin_load_file_blob)
    download_stream_load = conn_origin_container_load.download_blob().readall()
    data_load = json.loads(download_stream_load)
    df_load = pd.DataFrame(data_load['values'], columns=data_load['headers'])

    # Read Wenco haul file
    conn_origin_container_haul = BlobClient.from_connection_string(
        conn_str=BLOB_CONNECT_STRING,
        container_name=origin_haul_file_container,
        blob_name=origin_haul_file_blob)
    download_stream_haul = conn_origin_container_haul.download_blob().readall()
    data_haul = json.loads(download_stream_haul)
    df_haul = pd.DataFrame(data_haul['values'], columns=data_haul['headers'])

    # Join files: keep only the columns needed to bridge shovel and haul records
    df_load = df_load[['LOAD_REC_IDENT', 'HAUL_CYCLE_REC_IDENT']]

    # First join: shovel/load, keeping only rows matched on both sides
    df_ShovelLoad_outer = df_shovel.merge(df_load, how='left', on=['LOAD_REC_IDENT'], indicator=True)
    df_ShovelLoad = df_ShovelLoad_outer.loc[df_ShovelLoad_outer['_merge'] == 'both'].drop(['_merge'], axis=1)
    df_ShovelLoad = df_ShovelLoad.loc[df_ShovelLoad['HAUL_CYCLE_REC_IDENT'] != 'N\\A']
    df_ShovelLoad['HAUL_CYCLE_REC_IDENT'] = df_ShovelLoad['HAUL_CYCLE_REC_IDENT'].astype('float64')

    # Second join with haul; the merge indicator emits 'both'/'left_only'/'right_only',
    # so 'left_only' (not 'left', as the original had) is the value that also
    # keeps shovel rows without a haul match
    df_ShovelLoadHaul_outer = df_ShovelLoad.merge(df_haul, how='left', on=['HAUL_CYCLE_REC_IDENT'], indicator=True)
    df = df_ShovelLoadHaul_outer.loc[df_ShovelLoadHaul_outer['_merge'].isin(['both', 'left_only'])].drop(['_merge'], axis=1)

    # Write the joined result back to blob storage as table-oriented JSON
    blob = BlobClient.from_connection_string(
        conn_str=BLOB_CONNECT_STRING,
        container_name=final_container,
        blob_name=final_blob)
    json_data = df.to_json(index=False, orient='table')
    blob.upload_blob(json_data, overwrite=True)
```
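For context, here is a hypothetical sketch of how this callable is wired into the DAG as the `fragmentation_joinWencoFiles` task. The DAG file itself is not part of the dump, so the `DAG(...)` construction is an assumption; `dag_id`, `owner`, `start_date`, `schedule_interval`, and the `op_kwargs` values are copied from the attribute tables below:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# Hypothetical DAG definition; values mirror the attribute tables below.
dag = DAG(
    dag_id='SierraGorda_ParametersStorage',
    default_args={'owner': 'pedro'},
    start_date=datetime(2023, 4, 22),
    schedule_interval='45,45 12,0 * * *',
)

join_wenco = PythonOperator(
    task_id='fragmentation_joinWencoFiles',
    python_callable=generar_cruce_wenco_haul_shovel,
    op_kwargs={
        'origin_shovel_file_container': 'raw',
        'origin_shovel_file_blob': 'SierraGorda/2026-01-02/Wenco-Shovel-cycle.json',
        'origin_haul_file_container': 'raw',
        'origin_haul_file_blob': 'SierraGorda/2026-01-02/Wenco-Truck-cycle.json',
        'origin_load_file_container': 'raw',
        'origin_load_file_blob': 'SierraGorda/2026-01-02/LoadTrans.json',
        'final_container': 'processed',
        'final_blob': 'SierraGorda/Fragmentacion/2026-01-02/WencoShovelHaul.json',
    },
    dag=dag,
)
```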
| Task Instance Attribute | Value |
|---|---|
| dag_id | SierraGorda_ParametersStorage |
| duration | 0.935864 |
| end_date | 2026-01-02 12:45:49.249295+00:00 |
| execution_date | 2026-01-02T00:45:00+00:00 |
| executor_config | {} |
| generate_command | <function TaskInstance.generate_command at 0x72238e9a0040> |
| hostname | 9c59c2b0da67 |
| is_premature | False |
| job_id | 338 |
| key | ('SierraGorda_ParametersStorage', 'fragmentation_joinWencoFiles', <Pendulum [2026-01-02T00:45:00+00:00]>, 2) |
| log | <Logger airflow.task (INFO)> |
| log_filepath | /usr/local/airflow/logs/SierraGorda_ParametersStorage/fragmentation_joinWencoFiles/2026-01-02T00:45:00+00:00.log |
| log_url | http://localhost:8080/admin/airflow/log?execution_date=2026-01-02T00%3A45%3A00%2B00%3A00&task_id=fragmentation_joinWencoFiles&dag_id=SierraGorda_ParametersStorage |
| logger | <Logger airflow.task (INFO)> |
| mark_success_url | http://localhost:8080/success?task_id=fragmentation_joinWencoFiles&dag_id=SierraGorda_ParametersStorage&execution_date=2026-01-02T00%3A45%3A00%2B00%3A00&upstream=false&downstream=false |
| max_tries | 0 |
| metadata | MetaData(bind=None) |
| next_try_number | 2 |
| operator | PythonOperator |
| pid | 67130 |
| pool | default_pool |
| prev_attempted_tries | 1 |
| previous_execution_date_success | 2026-01-01 12:45:00+00:00 |
| previous_start_date_success | 2026-01-02 00:45:32.143353+00:00 |
| previous_ti | <TaskInstance: SierraGorda_ParametersStorage.fragmentation_joinWencoFiles 2026-01-01 12:45:00+00:00 [success]> |
| previous_ti_success | <TaskInstance: SierraGorda_ParametersStorage.fragmentation_joinWencoFiles 2026-01-01 12:45:00+00:00 [success]> |
| priority_weight | 5 |
| queue | default |
| queued_dttm | 2026-01-02 12:45:44.637890+00:00 |
| raw | False |
| run_as_user | None |
| start_date | 2026-01-02 12:45:48.313431+00:00 |
| state | success |
| task | <Task(PythonOperator): fragmentation_joinWencoFiles> |
| task_id | fragmentation_joinWencoFiles |
| test_mode | False |
| try_number | 2 |
| unixname | airflow |
| Task (Operator) Attribute | Value |
|---|---|
| dag | <DAG: SierraGorda_ParametersStorage> |
| dag_id | SierraGorda_ParametersStorage |
| depends_on_past | False |
| deps | {<TIDep(Not In Retry Period)>, <TIDep(Trigger Rule)>, <TIDep(Previous Dagrun State)>} |
| do_xcom_push | True |
| downstream_list | [<Task(PythonOperator): fragmentation_AddBenchToWenco>] |
| downstream_task_ids | {'fragmentation_AddBenchToWenco'} |
| email | None |
| email_on_failure | True |
| email_on_retry | True |
| end_date | None |
| execution_timeout | None |
| executor_config | {} |
| extra_links | [] |
| global_operator_extra_link_dict | {} |
| inlets | [] |
| lineage_data | None |
| log | <Logger airflow.task.operators (INFO)> |
| logger | <Logger airflow.task.operators (INFO)> |
| max_retry_delay | None |
| on_failure_callback | None |
| on_retry_callback | None |
| on_success_callback | None |
| op_args | [] |
| op_kwargs | {'origin_shovel_file_container': 'raw', 'origin_shovel_file_blob': 'SierraGorda/2026-01-02/Wenco-Shovel-cycle.json', 'origin_haul_file_container': 'raw', 'origin_haul_file_blob': 'SierraGorda/2026-01-02/Wenco-Truck-cycle.json', 'origin_load_file_container': 'raw', 'origin_load_file_blob': 'SierraGorda/2026-01-02/LoadTrans.json', 'final_container': 'processed', 'final_blob': 'SierraGorda/Fragmentacion/2026-01-02/WencoShovelHaul.json'} |
| operator_extra_link_dict | {} |
| operator_extra_links | () |
| outlets | [] |
| owner | pedro |
| params | {} |
| pool | default_pool |
| priority_weight | 1 |
| priority_weight_total | 5 |
| provide_context | False |
| queue | default |
| resources | None |
| retries | 0 |
| retry_delay | 0:05:00 |
| retry_exponential_backoff | False |
| run_as_user | None |
| schedule_interval | 45,45 12,0 * * * |
| shallow_copy_attrs | ('python_callable', 'op_kwargs') |
| sla | None |
| start_date | 2023-04-22T00:00:00+00:00 |
| subdag | None |
| task_concurrency | None |
| task_id | fragmentation_joinWencoFiles |
| task_type | PythonOperator |
| template_ext | [] |
| template_fields | ('templates_dict', 'op_args', 'op_kwargs') |
| templates_dict | None |
| trigger_rule | all_success |
| ui_color | #ffefeb |
| ui_fgcolor | #000 |
| upstream_list | [] |
| upstream_task_ids | set() |
| wait_for_downstream | False |
| weight_rule | downstream |
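Since the callable above writes the joined result as table-oriented JSON, the downstream `fragmentation_AddBenchToWenco` task can presumably recover the DataFrame with pandas' matching reader. A minimal sketch, assuming the same `BLOB_CONNECT_STRING` constant is in scope (variable names here are illustrative):

```python
import io

import pandas as pd
from azure.storage.blob import BlobClient

# Download the blob written by generar_cruce_wenco_haul_shovel.
blob = BlobClient.from_connection_string(
    conn_str=BLOB_CONNECT_STRING,
    container_name='processed',
    blob_name='SierraGorda/Fragmentacion/2026-01-02/WencoShovelHaul.json')
payload = blob.download_blob().readall()

# orient='table' embeds a JSON Table Schema, so reading with the same
# orient round-trips column names and dtypes.
df = pd.read_json(io.StringIO(payload.decode('utf-8')), orient='table')
```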