Hyperliquid End-to-End
Build a model on CrowdCent's training data and submit
In [1]:
%pip install crowdcent-challenge
import crowdcent_challenge as cc
import polars as pl
from xgboost import XGBRegressor
For this tutorial, you will need:
- CrowdCent account: register for free
- CrowdCent API Key: generate an API key from your user profile
Feeling more advanced? Check out the multi-slot, multi-dataset end-to-end example
Load API key
In [2]:
CROWDCENT_API_KEY = "API_KEY_HERE"
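Hardcoding the key is fine for a quick demo, but a common alternative is to read it from an environment variable so the notebook can be shared without leaking credentials. A minimal sketch, assuming you export `CROWDCENT_API_KEY` in your shell before launching the kernel:

```python
import os

# Read the key from the environment if present; fall back to the placeholder
# so the cell still runs before the variable is set.
CROWDCENT_API_KEY = os.environ.get("CROWDCENT_API_KEY", "API_KEY_HERE")
```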
Initialize the client
In [3]:
client = cc.ChallengeClient(
    challenge_slug="hyperliquid-ranking",
    api_key=CROWDCENT_API_KEY,
)
2026-02-04 21:19:40,206 - INFO - ChallengeClient initialized for 'hyperliquid-ranking' at URL: https://crowdcent.com/api
Get CrowdCent's training data
In [4]:
client.download_training_dataset(version="latest", dest_path="training_data.parquet")
training_data = pl.read_parquet("training_data.parquet")
training_data.head()
2026-02-04 21:19:40,508 - INFO - Downloading training data v2.0 to training_data.parquet
Downloading training_data.parquet: 100%|██████████| 124M/124M [00:02<00:00, 55.0MB/s]
2026-02-04 21:19:43,278 - INFO - Successfully downloaded training data v2.0 to training_data.parquet
Out[4]:
shape: (5, 85)
| id | eodhd_id | date | feature_16_lag15 | feature_13_lag15 | feature_14_lag15 | feature_15_lag15 | feature_8_lag15 | feature_5_lag15 | feature_6_lag15 | feature_7_lag15 | feature_12_lag15 | feature_9_lag15 | feature_10_lag15 | feature_11_lag15 | feature_4_lag15 | feature_1_lag15 | feature_2_lag15 | feature_3_lag15 | feature_20_lag15 | feature_17_lag15 | feature_18_lag15 | feature_19_lag15 | feature_16_lag10 | feature_13_lag10 | feature_14_lag10 | feature_15_lag10 | feature_8_lag10 | feature_5_lag10 | feature_6_lag10 | feature_7_lag10 | feature_12_lag10 | feature_9_lag10 | feature_10_lag10 | feature_11_lag10 | feature_4_lag10 | feature_1_lag10 | … | feature_5_lag5 | feature_6_lag5 | feature_7_lag5 | feature_12_lag5 | feature_9_lag5 | feature_10_lag5 | feature_11_lag5 | feature_4_lag5 | feature_1_lag5 | feature_2_lag5 | feature_3_lag5 | feature_20_lag5 | feature_17_lag5 | feature_18_lag5 | feature_19_lag5 | feature_16_lag0 | feature_13_lag0 | feature_14_lag0 | feature_15_lag0 | feature_8_lag0 | feature_5_lag0 | feature_6_lag0 | feature_7_lag0 | feature_12_lag0 | feature_9_lag0 | feature_10_lag0 | feature_11_lag0 | feature_4_lag0 | feature_1_lag0 | feature_2_lag0 | feature_3_lag0 | feature_20_lag0 | feature_17_lag0 | feature_18_lag0 | feature_19_lag0 | target_10d | target_30d |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| str | str | datetime[μs] | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | … | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 |
| "0G" | "0G-USD.CC" | 2025-11-16 00:00:00 | 0.157692 | 0.156336 | 0.239762 | 0.313349 | 0.051923 | 0.135122 | 0.269735 | 0.300353 | 0.189423 | 0.200007 | 0.227781 | 0.336593 | 0.161538 | 0.145204 | 0.238061 | 0.291121 | 0.605769 | 0.587816 | 0.566855 | 0.516127 | 0.264423 | 0.211058 | 0.241617 | 0.296768 | 0.329808 | 0.190865 | 0.256535 | 0.31913 | 0.407692 | 0.298558 | 0.277047 | 0.321981 | 0.257692 | 0.209615 | … | 0.353349 | 0.244235 | 0.33943 | 0.549011 | 0.478352 | 0.339179 | 0.344593 | 0.496117 | 0.376905 | 0.261054 | 0.311914 | 0.270146 | 0.324496 | 0.456156 | 0.475274 | 0.522488 | 0.5093 | 0.360179 | 0.328206 | 0.611483 | 0.494187 | 0.342526 | 0.351089 | 0.536842 | 0.542926 | 0.420742 | 0.331381 | 0.524402 | 0.510259 | 0.359937 | 0.322647 | 0.588517 | 0.429332 | 0.46082 | 0.494638 | 0.832536 | 0.186603 |
| "0G" | "0G-USD.CC" | 2025-11-17 00:00:00 | 0.160577 | 0.145192 | 0.238297 | 0.291124 | 0.035577 | 0.134135 | 0.257164 | 0.280925 | 0.021154 | 0.200481 | 0.224395 | 0.312074 | 0.033654 | 0.146154 | 0.224768 | 0.267206 | 0.609615 | 0.603365 | 0.603132 | 0.512047 | 0.257692 | 0.209135 | 0.236536 | 0.277681 | 0.332692 | 0.184135 | 0.25169 | 0.317794 | 0.417308 | 0.219231 | 0.271717 | 0.302891 | 0.334615 | 0.184135 | … | 0.420793 | 0.277464 | 0.332612 | 0.639856 | 0.528582 | 0.364531 | 0.333954 | 0.594281 | 0.464448 | 0.305301 | 0.309641 | 0.431731 | 0.338462 | 0.470913 | 0.474799 | 0.524402 | 0.51072 | 0.359927 | 0.322639 | 0.470813 | 0.489853 | 0.336994 | 0.348187 | 0.466986 | 0.553421 | 0.386326 | 0.333876 | 0.41244 | 0.503361 | 0.343748 | 0.328964 | 0.400957 | 0.416344 | 0.421874 | 0.471446 | 0.779904 | 0.167464 |
| "0G" | "0G-USD.CC" | 2025-11-18 00:00:00 | 0.032692 | 0.145673 | 0.225003 | 0.267208 | 0.225 | 0.228846 | 0.280594 | 0.305561 | 0.215385 | 0.281731 | 0.264013 | 0.313656 | 0.225962 | 0.228365 | 0.268489 | 0.291843 | 0.769231 | 0.639423 | 0.616418 | 0.536684 | 0.334615 | 0.183654 | 0.25431 | 0.283911 | 0.332692 | 0.278846 | 0.297583 | 0.341345 | 0.392308 | 0.303846 | 0.313781 | 0.324164 | 0.335577 | 0.280769 | … | 0.339453 | 0.28415 | 0.320811 | 0.62851 | 0.510409 | 0.39607 | 0.330011 | 0.501458 | 0.418518 | 0.323442 | 0.322918 | 0.242262 | 0.243727 | 0.441575 | 0.474793 | 0.41244 | 0.503821 | 0.343737 | 0.329076 | 0.484211 | 0.415212 | 0.347029 | 0.348368 | 0.355024 | 0.491767 | 0.397807 | 0.338166 | 0.453589 | 0.477523 | 0.379146 | 0.343882 | 0.476555 | 0.359408 | 0.43331 | 0.48185 | 0.885167 | 0.22488 |
| "0G" | "0G-USD.CC" | 2025-11-19 00:00:00 | 0.225 | 0.227885 | 0.268724 | 0.291845 | 0.327885 | 0.270673 | 0.302442 | 0.319021 | 0.409615 | 0.377404 | 0.30822 | 0.338053 | 0.261538 | 0.217788 | 0.270617 | 0.296166 | 0.607692 | 0.584615 | 0.588395 | 0.535952 | 0.335577 | 0.280288 | 0.29972 | 0.30867 | 0.372115 | 0.35 | 0.290399 | 0.359733 | 0.394231 | 0.401923 | 0.355811 | 0.347837 | 0.495192 | 0.378365 | … | 0.316703 | 0.293688 | 0.328585 | 0.489759 | 0.441995 | 0.409699 | 0.335865 | 0.335922 | 0.415557 | 0.316673 | 0.324101 | 0.228722 | 0.250419 | 0.417517 | 0.471778 | 0.453589 | 0.477984 | 0.379136 | 0.343994 | 0.567464 | 0.414377 | 0.382189 | 0.347465 | 0.36555 | 0.427655 | 0.414789 | 0.32552 | 0.440191 | 0.388057 | 0.383211 | 0.325019 | 0.552153 | 0.390438 | 0.415171 | 0.485846 | 0.870813 | 0.301435 |
| "0G" | "0G-USD.CC" | 2025-11-20 00:00:00 | 0.260577 | 0.217308 | 0.270853 | 0.296168 | 0.329808 | 0.251923 | 0.289152 | 0.318531 | 0.414423 | 0.300481 | 0.283331 | 0.338529 | 0.265385 | 0.211538 | 0.241381 | 0.296764 | 0.445192 | 0.578846 | 0.564472 | 0.513369 | 0.495192 | 0.377885 | 0.301414 | 0.332464 | 0.377885 | 0.353846 | 0.243044 | 0.338834 | 0.546154 | 0.480288 | 0.346681 | 0.3446 | 0.495192 | 0.380288 | … | 0.402339 | 0.327131 | 0.328132 | 0.361722 | 0.453938 | 0.377209 | 0.31916 | 0.522488 | 0.50884 | 0.360189 | 0.328094 | 0.419139 | 0.342742 | 0.460794 | 0.47396 | 0.440191 | 0.388517 | 0.383201 | 0.325131 | 0.402871 | 0.414833 | 0.384339 | 0.327887 | 0.42488 | 0.393301 | 0.436795 | 0.332415 | 0.308134 | 0.415311 | 0.3978 | 0.318543 | 0.427751 | 0.423445 | 0.389607 | 0.490231 | 0.799043 | 0.124402 |
Train a model on the training data
In [5]:
xgb_regressor = XGBRegressor(n_estimators=200, device="cuda")
feature_names = [col for col in training_data.columns if col.startswith("feature")]
xgb_regressor.fit(
    training_data[feature_names],
    training_data[["target_10d", "target_30d"]],
)
Out[5]:
XGBRegressor(base_score=None, booster=None, callbacks=None,
colsample_bylevel=None, colsample_bynode=None,
colsample_bytree=None, device='cuda', early_stopping_rounds=None,
enable_categorical=False, eval_metric=None, feature_types=None,
feature_weights=None, gamma=None, grow_policy=None,
importance_type=None, interaction_constraints=None,
learning_rate=None, max_bin=None, max_cat_threshold=None,
max_cat_to_onehot=None, max_delta_step=None, max_depth=None,
max_leaves=None, min_child_weight=None, missing=nan,
monotone_constraints=None, multi_strategy=None, n_estimators=200,
n_jobs=None, num_parallel_tree=None, ...)
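The cell above fits on every row of the training data. Because rows are time-ordered, one way to gauge out-of-sample performance before submitting is to hold out the most recent dates for validation. A minimal sketch of such a split over the unique dates; the helper name and `valid_frac` parameter are illustrative, not part of the challenge library:

```python
def split_dates(dates, valid_frac=0.2):
    """Partition a sequence of dates so the most recent `valid_frac` of the
    unique dates form the validation set; the model then never trains on
    dates inside its validation window."""
    uniq = sorted(set(dates))
    cutoff = uniq[int(len(uniq) * (1 - valid_frac))]
    train = [d for d in dates if d < cutoff]
    valid = [d for d in dates if d >= cutoff]
    return train, valid, cutoff

# Toy example with five unique "dates"
train, valid, cutoff = split_dates([1, 1, 2, 2, 3, 3, 4, 4, 5, 5])
```

With the real frame you would then filter rows on either side of the cutoff, e.g. `training_data.filter(pl.col("date") < cutoff)`, fit on the earlier slice, and score predictions against the later one.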
Get CrowdCent's latest inference data
In [6]:
client.download_inference_data("latest", "inference_data.parquet")
inference_data = pl.read_parquet("inference_data.parquet")
inference_data.head()
2026-02-04 21:19:46,297 - INFO - Downloading inference data 2026-02-04 to inference_data.parquet
Downloading inference_data.parquet: 100%|██████████| 138k/138k [00:00<00:00, 65.9MB/s]
2026-02-04 21:19:46,465 - INFO - Successfully downloaded inference data 2026-02-04 to inference_data.parquet
Out[6]:
shape: (5, 83)
| id | eodhd_id | date | feature_16_lag15 | feature_13_lag15 | feature_14_lag15 | feature_15_lag15 | feature_8_lag15 | feature_5_lag15 | feature_6_lag15 | feature_7_lag15 | feature_12_lag15 | feature_9_lag15 | feature_10_lag15 | feature_11_lag15 | feature_4_lag15 | feature_1_lag15 | feature_2_lag15 | feature_3_lag15 | feature_20_lag15 | feature_17_lag15 | feature_18_lag15 | feature_19_lag15 | feature_16_lag10 | feature_13_lag10 | feature_14_lag10 | feature_15_lag10 | feature_8_lag10 | feature_5_lag10 | feature_6_lag10 | feature_7_lag10 | feature_12_lag10 | feature_9_lag10 | feature_10_lag10 | feature_11_lag10 | feature_4_lag10 | feature_1_lag10 | … | feature_15_lag5 | feature_8_lag5 | feature_5_lag5 | feature_6_lag5 | feature_7_lag5 | feature_12_lag5 | feature_9_lag5 | feature_10_lag5 | feature_11_lag5 | feature_4_lag5 | feature_1_lag5 | feature_2_lag5 | feature_3_lag5 | feature_20_lag5 | feature_17_lag5 | feature_18_lag5 | feature_19_lag5 | feature_16_lag0 | feature_13_lag0 | feature_14_lag0 | feature_15_lag0 | feature_8_lag0 | feature_5_lag0 | feature_6_lag0 | feature_7_lag0 | feature_12_lag0 | feature_9_lag0 | feature_10_lag0 | feature_11_lag0 | feature_4_lag0 | feature_1_lag0 | feature_2_lag0 | feature_3_lag0 | feature_20_lag0 | feature_17_lag0 | feature_18_lag0 | feature_19_lag0 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| str | str | datetime[μs] | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | … | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 |
| "BIO" | "BIO.CC" | 2026-02-03 00:00:00 | 0.598925 | 0.46837 | 0.450671 | 0.487363 | 0.515054 | 0.55032 | 0.502187 | 0.518391 | 0.67957 | 0.547472 | 0.506168 | 0.515922 | 0.652688 | 0.546998 | 0.487283 | 0.487425 | 0.530108 | 0.512671 | 0.498768 | 0.4706 | 0.703226 | 0.651075 | 0.516748 | 0.510942 | 0.705376 | 0.610215 | 0.533936 | 0.543455 | 0.736559 | 0.708065 | 0.559768 | 0.540695 | 0.686022 | 0.669355 | … | 0.496118 | 0.247312 | 0.476344 | 0.513332 | 0.518288 | 0.163441 | 0.45 | 0.498736 | 0.500854 | 0.2 | 0.443011 | 0.495004 | 0.485205 | 0.389247 | 0.476882 | 0.494776 | 0.491848 | 0.28165 | 0.31018 | 0.480627 | 0.464973 | 0.240447 | 0.243879 | 0.427047 | 0.465506 | 0.216531 | 0.189986 | 0.449025 | 0.458326 | 0.301702 | 0.250851 | 0.460103 | 0.453864 | 0.501737 | 0.445492 | 0.496402 | 0.476321 |
| "JUP" | "JUP29210-USD.CC" | 2026-02-03 00:00:00 | 0.519355 | 0.51587 | 0.551449 | 0.538697 | 0.431183 | 0.51299 | 0.554874 | 0.524464 | 0.507527 | 0.512993 | 0.543253 | 0.519194 | 0.441935 | 0.515705 | 0.537852 | 0.536899 | 0.484946 | 0.51145 | 0.517347 | 0.506106 | 0.539785 | 0.52957 | 0.533422 | 0.560765 | 0.55914 | 0.495161 | 0.537902 | 0.561383 | 0.534409 | 0.520968 | 0.526315 | 0.548022 | 0.594624 | 0.51828 | … | 0.561927 | 0.597849 | 0.578495 | 0.545743 | 0.558006 | 0.733333 | 0.633871 | 0.573432 | 0.554959 | 0.622581 | 0.608602 | 0.562153 | 0.555401 | 0.629032 | 0.553763 | 0.532607 | 0.515087 | 0.50166 | 0.591153 | 0.560361 | 0.545175 | 0.470844 | 0.534347 | 0.514754 | 0.532943 | 0.53588 | 0.634607 | 0.577787 | 0.543836 | 0.487493 | 0.555037 | 0.536658 | 0.538635 | 0.497737 | 0.563385 | 0.522553 | 0.509872 |
| "BABY" | "BABY32198-USD.CC" | 2026-02-03 00:00:00 | 0.648387 | 0.573011 | 0.498397 | 0.509604 | 0.689247 | 0.484522 | 0.469018 | 0.503833 | 0.688172 | 0.5124 | 0.512416 | 0.515668 | 0.683871 | 0.551645 | 0.499336 | 0.505479 | 0.573118 | 0.468003 | 0.473191 | 0.472271 | 0.636559 | 0.642473 | 0.564834 | 0.528498 | 0.651613 | 0.67043 | 0.545164 | 0.521096 | 0.6 | 0.644086 | 0.570795 | 0.521884 | 0.578495 | 0.631183 | … | 0.520413 | 0.677419 | 0.664516 | 0.574519 | 0.525908 | 0.673118 | 0.636559 | 0.57448 | 0.541699 | 0.703226 | 0.64086 | 0.596253 | 0.531505 | 0.54086 | 0.513441 | 0.490722 | 0.48428 | 0.68777 | 0.626681 | 0.634577 | 0.546385 | 0.626864 | 0.652142 | 0.661286 | 0.541834 | 0.722811 | 0.697965 | 0.671025 | 0.574348 | 0.613299 | 0.658262 | 0.644723 | 0.550194 | 0.313801 | 0.427331 | 0.47845 | 0.451884 |
| "CHILLGUY" | "CHILLGUY-USD.CC" | 2026-02-03 00:00:00 | 0.229032 | 0.407475 | 0.562386 | 0.515923 | 0.239785 | 0.367155 | 0.546821 | 0.52314 | 0.268817 | 0.390139 | 0.565881 | 0.521184 | 0.250538 | 0.332299 | 0.537501 | 0.515237 | 0.53871 | 0.49311 | 0.495474 | 0.523818 | 0.224731 | 0.226882 | 0.404785 | 0.488474 | 0.331183 | 0.285484 | 0.411779 | 0.495078 | 0.143011 | 0.205914 | 0.411363 | 0.466087 | 0.347312 | 0.298925 | … | 0.492309 | 0.522581 | 0.426882 | 0.397018 | 0.499996 | 0.533333 | 0.338172 | 0.364156 | 0.480456 | 0.402151 | 0.374731 | 0.353515 | 0.47973 | 0.434409 | 0.477957 | 0.485533 | 0.51101 | 0.423053 | 0.473355 | 0.350118 | 0.481812 | 0.501182 | 0.511881 | 0.398683 | 0.497914 | 0.500508 | 0.516921 | 0.361417 | 0.490587 | 0.547052 | 0.474601 | 0.386763 | 0.487977 | 0.561231 | 0.49782 | 0.513964 | 0.512921 |
| "SPX" | "SPX28081-USD.CC" | 2026-02-03 00:00:00 | 0.323656 | 0.480026 | 0.50731 | 0.484061 | 0.21828 | 0.418454 | 0.470578 | 0.443127 | 0.258065 | 0.437998 | 0.472512 | 0.456526 | 0.263441 | 0.418997 | 0.473823 | 0.476101 | 0.5 | 0.493638 | 0.514927 | 0.492058 | 0.184946 | 0.254301 | 0.377331 | 0.45326 | 0.109677 | 0.163978 | 0.325295 | 0.43062 | 0.222581 | 0.240323 | 0.368158 | 0.45489 | 0.156989 | 0.210215 | … | 0.447719 | 0.42043 | 0.265054 | 0.341754 | 0.435607 | 0.473118 | 0.347849 | 0.392924 | 0.453489 | 0.41828 | 0.287634 | 0.353316 | 0.441658 | 0.413978 | 0.504301 | 0.49897 | 0.508269 | 0.375854 | 0.388464 | 0.321383 | 0.436187 | 0.342887 | 0.381658 | 0.272818 | 0.411711 | 0.5176 | 0.495359 | 0.367841 | 0.456702 | 0.420826 | 0.419553 | 0.314884 | 0.43291 | 0.539673 | 0.476826 | 0.512069 | 0.511539 |
Make predictions on the inference data
In [7]:
preds = xgb_regressor.predict(inference_data[feature_names])
pred_df = pl.from_numpy(preds, ["pred_10d", "pred_30d"])
/home/exx/projects/crowdcent-challenge/.venv/lib/python3.12/site-packages/xgboost/core.py:774: UserWarning: [21:19:46] WARNING: /workspace/src/common/error_msg.cc:62: Falling back to prediction using DMatrix due to mismatched devices. This might lead to higher memory usage and slower performance. XGBoost is running on: cuda:0, while the input data is on: cpu. Potential solutions:
- Use a data structure that matches the device ordinal in the booster.
- Set the device for booster before call to inplace_predict. This warning will only be shown once.
  return func(**kwargs)
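Since the challenge scores the relative ordering of assets and expects values in [0, 1], an alternative to clipping raw regression outputs is to rank-transform them, which preserves the ordering exactly and always lands inside the bounds. A minimal sketch; the helper name is illustrative, and ties are broken by position rather than averaged:

```python
def percentile_rank(scores):
    """Map raw scores to (0, 1) by rank: the lowest score gets the smallest
    value, the highest the largest, and the ordering is unchanged."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    n = len(scores)
    ranks = [0.0] * n
    for position, i in enumerate(order):
        ranks[i] = (position + 0.5) / n
    return ranks

ranks = percentile_rank([3.2, -1.0, 0.4])
```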
Submit to the hyperliquid-ranking challenge on CrowdCent
In [8]:
pred_df = pred_df.with_columns(inference_data["id"]).select(
    ["id", "pred_10d", "pred_30d"]
)
# ensure predictions are between 0 and 1
pred_df = pred_df.with_columns(pl.col(["pred_10d", "pred_30d"]).clip(0, 1))
with pl.Config(tbl_rows=20):
    display(pred_df.sort("pred_30d", descending=True))
shape: (186, 3)
| id | pred_10d | pred_30d |
|---|---|---|
| str | f32 | f32 |
| "SOPH" | 0.481924 | 0.825876 |
| "BERA" | 0.363361 | 0.745414 |
| "HYPE" | 0.44265 | 0.693696 |
| "XPL" | 0.705975 | 0.667599 |
| "MERL" | 0.624524 | 0.642958 |
| "STRK" | 0.599383 | 0.632716 |
| "ALGO" | 0.541823 | 0.620179 |
| "TRX" | 0.494462 | 0.608403 |
| "LDO" | 0.540888 | 0.608148 |
| "ZRO" | 0.447225 | 0.607567 |
| … | … | … |
| "JTO" | 0.294615 | 0.351979 |
| "TNSR" | 0.410136 | 0.350033 |
| "AVNT" | 0.198144 | 0.342407 |
| "XMR" | 0.495307 | 0.339069 |
| "BIO" | 0.316203 | 0.298175 |
| "LAYER" | 0.630956 | 0.290479 |
| "DASH" | 0.309354 | 0.289107 |
| "BLUR" | 0.241445 | 0.286826 |
| "CC" | 0.676866 | 0.24413 |
| "AXS" | 0.536369 | 0.209781 |
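Before submitting, a quick sanity check on the prediction frame can catch easy mistakes such as duplicate ids or out-of-range values. A minimal sketch with plain Python sequences; the helper is hypothetical, not part of `crowdcent-challenge` (with polars you could pull columns out via `pred_df["id"].to_list()` and so on):

```python
def check_submission(ids, pred_10d, pred_30d):
    """Raise if the three columns are inconsistent: mismatched lengths,
    duplicate ids, or predictions missing / outside [0, 1]."""
    if not (len(ids) == len(pred_10d) == len(pred_30d)):
        raise ValueError("columns have different lengths")
    if len(set(ids)) != len(ids):
        raise ValueError("duplicate ids")
    for col in (pred_10d, pred_30d):
        if any(p is None or not 0.0 <= p <= 1.0 for p in col):
            raise ValueError("prediction missing or outside [0, 1]")
    return True

ok = check_submission(["SOPH", "BERA"], [0.48, 0.36], [0.83, 0.75])
```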
In [9]:
# directly submit a dataframe to slot 1
client.submit_predictions(df=pred_df, slot=1)
2026-02-04 21:19:46,617 - INFO - Wrote DataFrame to temporary file: submission.parquet
2026-02-04 21:19:46,618 - INFO - Submitting predictions from submission.parquet to challenge 'hyperliquid-ranking' (Slot: 1)
2026-02-04 21:19:47,091 - INFO - Submission queued (slot 1)
Out[9]:
{'match_info': {'matched_ids': 186,
'unmatched_ids': 0,
'message': '186 IDs matched inference data'},
'status': 'queued',
'message': 'Submission queued for slot 1. Will be automatically submitted when next period opens.',
'slot': 1,
'challenge': 'hyperliquid-ranking'}