Allow for initial config to be set
mseeger committed Dec 7, 2022
1 parent a1b546c commit 5c00f03
Showing 4 changed files with 28 additions and 18 deletions.
8 changes: 4 additions & 4 deletions chapter_hyperparameter_optimization/hyperband-intro.md
@@ -194,16 +194,16 @@ Let us see how successive halving is doing on our neural network example.
min_number_of_epochs = 1
max_number_of_epochs = 4
-search_space = {
-    "learning_rate": stats.loguniform(1e-4, 1),
-    "batch_size": stats.randint(8, 128),
+config_space = {
+    "learning_rate": stats.loguniform(1e-3, 1),
+    "batch_size": stats.randint(32, 256),
}
```
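As a side note (not part of this commit), the entries of `config_space` are frozen `scipy.stats` distributions, so drawing a candidate configuration just means calling `.rvs()` on each domain. A minimal sketch, assuming only `scipy` is installed:

```python
from scipy import stats

# The (new) search space from the hunk above: log-uniform learning rate,
# integer batch size drawn from [32, 256).
config_space = {
    "learning_rate": stats.loguniform(1e-3, 1),
    "batch_size": stats.randint(32, 256),
}

# One random configuration: sample each domain independently.
config = {name: domain.rvs() for name, domain in config_space.items()}
print(config)  # e.g. {'learning_rate': 0.0271, 'batch_size': 147}
```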

We just replace the scheduler with our new `SuccessiveHalvingScheduler`.
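As an aside (our sketch, not code from the commit), the rung levels used by successive halving follow directly from the values set above: start at `min_number_of_epochs = 1` and multiply by `eta = 2` until `max_number_of_epochs = 4` is reached.

```python
# Rung levels r = r_min * eta^k, for k = 0, 1, ... while r <= r_max.
min_number_of_epochs, max_number_of_epochs, eta = 1, 4, 2

rung_levels = []
level = min_number_of_epochs
while level <= max_number_of_epochs:
    rung_levels.append(level)
    level *= eta
print(rung_levels)  # [1, 2, 4]: after each rung, only the best 1/eta survive
```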

```{.python .input n=14}
-searcher = d2l.RandomSearcher(search_space)
+searcher = d2l.RandomSearcher(config_space)
scheduler = SuccessiveHalvingScheduler(
    searcher=searcher,
    eta=2,
26 changes: 18 additions & 8 deletions chapter_hyperparameter_optimization/hyperopt-api.md
@@ -56,14 +56,19 @@ previous section in this API:
```{.python .input n=4}
%%tab all
class RandomSearcher(HPOSearcher): #@save
-    def __init__(self, config_space: dict):
+    def __init__(self, config_space: dict, initial_config=None):
        self.save_hyperparameters()

    def sample_configuration(self):
-        return {
-            name: domain.rvs()
-            for name, domain in self.config_space.items()
-        }
+        if self.initial_config is not None:
+            result = self.initial_config
+            self.initial_config = None
+        else:
+            result = {
+                name: domain.rvs()
+                for name, domain in self.config_space.items()
+            }
+        return result
```
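The effect of the new `initial_config` argument is that the very first call to `sample_configuration` returns the user-supplied configuration, and every later call falls back to random sampling. Below is a self-contained sketch of the same logic, using plain `scipy.stats` domains instead of d2l's `HPOSearcher`/`save_hyperparameters` machinery:

```python
from scipy import stats

class InitialConfigSearcherDemo:
    """Stand-in that mimics the patched RandomSearcher.sample_configuration."""

    def __init__(self, config_space, initial_config=None):
        self.config_space = config_space
        self.initial_config = initial_config

    def sample_configuration(self):
        if self.initial_config is not None:
            # First call: hand back the default configuration once.
            result = self.initial_config
            self.initial_config = None
        else:
            # Later calls: draw each hyperparameter at random.
            result = {
                name: domain.rvs()
                for name, domain in self.config_space.items()
            }
        return result

searcher = InitialConfigSearcherDemo(
    config_space={"learning_rate": stats.loguniform(1e-3, 1),
                  "batch_size": stats.randint(32, 256)},
    initial_config={"learning_rate": 0.1, "batch_size": 128},
)
print(searcher.sample_configuration())  # the hand-picked defaults
print(searcher.sample_configuration())  # a random draw from config_space
```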

### Scheduler
@@ -190,15 +195,20 @@ We also need to define the configuration space.

```{.python .input n=10}
config_space = {
"learning_rate": stats.loguniform(1e-4, 1),
"batch_size": stats.randint(8, 128),
"learning_rate": stats.loguniform(1e-3, 1),
"batch_size": stats.randint(32, 256),
}
```

Now we can start our random search:

```{.python .input}
-searcher = RandomSearcher(config_space)
+# We start with sensible defaults
+initial_config = {
+    "learning_rate": 0.1,
+    "batch_size": 128,
+}
+searcher = RandomSearcher(config_space, initial_config=initial_config)
scheduler = BasicScheduler(searcher=searcher)
tuner = HPOTuner(scheduler=scheduler, objective=hpo_objective_lenet)
tuner.run(number_of_trials=5)
6 changes: 3 additions & 3 deletions chapter_hyperparameter_optimization/rs-async.md
@@ -110,9 +110,9 @@ We make use of this feature in order to pass `max_epochs`.

```{.python .input n=39}
config_space = {
"learning_rate": loguniform(1e-5, 1e-1),
"batch_size": randint(8, 128),
"max_epochs": 4,
"learning_rate": loguniform(1e-3, 1),
"batch_size": randint(32, 256),
"max_epochs": 8,
}
```
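For context (our sketch, not code from the commit): every key of `config_space`, including the constant `max_epochs`, ends up as an argument of the training function, so the objective can read its epoch budget from the same dictionary. A hypothetical, heavily simplified objective with that signature:

```python
def hpo_objective_sketch(learning_rate, batch_size, max_epochs):
    # Stand-in for the LeNet objective used in this chapter: the constant
    # "max_epochs" arrives exactly like the sampled hyperparameters.
    validation_error = 1.0
    for epoch in range(1, max_epochs + 1):
        # A real objective would train for one epoch and report the metric
        # back to the scheduler here (e.g. via Syne Tune's Reporter).
        validation_error *= 0.9  # dummy "improvement" per epoch
    return validation_error

# Example call with one sampled configuration plus the constant:
print(hpo_objective_sketch(learning_rate=0.05, batch_size=64, max_epochs=8))
```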

6 changes: 3 additions & 3 deletions chapter_hyperparameter_optimization/sh-async.md
@@ -110,11 +110,11 @@ def hpo_objective_lenet_synetune(learning_rate, batch_size, max_epochs):
We will also use the same configuration space as before:

```{.python .input n=55}
-max_epochs = 4
+max_epochs = 8
config_space = {
-    "learning_rate": loguniform(1e-5, 1e-1),
-    "batch_size": randint(8, 128),
+    "learning_rate": loguniform(1e-3, 1),
+    "batch_size": randint(32, 256),
    "max_epochs": max_epochs
}
```
