MilesCranmer committed
Commit a86f107
Parent: dc9d777

More instructions added

Files changed (1): README.md (+2 -2)
README.md CHANGED

@@ -105,8 +105,6 @@ for:
 # TODO
 
 - [ ] Hyperparameter tune
-- [ ] Add interface for either defining an operation to learn, or loading in arbitrary dataset.
-  - Could just write out the dataset in julia, or load it.
 - [ ] Add mutation for constant<->variable
 - [ ] Create a benchmark for accuracy
 - [ ] Use NN to generate weights over all probability distribution conditional on error and existing equation, and train on some randomly-generated equations
@@ -117,6 +115,8 @@ for:
   - Seems like its necessary right now. But still by far the slowest option.
 - [ ] Calculating the loss function - there is duplicate calculations happening.
 - [ ] Declaration of the weights array every iteration
+- [x] Add interface for either defining an operation to learn, or loading in arbitrary dataset.
+  - Could just write out the dataset in julia, or load it.
 - [x] Create a Python interface
 - [x] Explicit constant optimization on hall-of-fame
   - Create method to find and return all constants, from left to right
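The item checked off in this commit suggests either writing the dataset out from Julia or loading an arbitrary one from disk. Below is a minimal sketch of that idea using Julia's standard DelimitedFiles module; the file name, column layout, and variable names are illustrative assumptions, not the repository's actual interface.

```julia
# Sketch only: assumed layout with features in the leading columns and the
# target in the last column; not the project's real API.
using DelimitedFiles

X = randn(100, 5)                        # 100 rows, 5 input features
y = X[:, 1] .^ 2 .+ 2 .* cos.(X[:, 3])   # target to regress against

writedlm("dataset.csv", hcat(X, y), ',') # write the dataset out from Julia

# ...and a search script could load an arbitrary dataset the same way:
data = readdlm("dataset.csv", ',')
X_loaded, y_loaded = data[:, 1:end-1], data[:, end]
```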
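The already-completed hall-of-fame item mentions a method that finds and returns all constants of an equation from left to right. The following is a hedged sketch of such a traversal over a simple binary expression tree; the `Node` layout here is assumed for illustration and is not the project's actual type.

```julia
# Assumed tree layout: a leaf holds either a constant value or a feature index;
# an internal node holds left/right children (the operator is omitted for brevity).
struct Node
    constant::Union{Float64, Nothing}
    feature::Union{Int, Nothing}
    l::Union{Node, Nothing}
    r::Union{Node, Nothing}
end

leaf_const(c) = Node(c, nothing, nothing, nothing)
leaf_var(i)   = Node(nothing, i, nothing, nothing)
branch(l, r)  = Node(nothing, nothing, l, r)

# Depth-first, left child before right, so constants come back in left-to-right order.
function get_constants(node::Node)
    consts = Float64[]
    node.constant !== nothing && push!(consts, node.constant)
    node.l !== nothing && append!(consts, get_constants(node.l))
    node.r !== nothing && append!(consts, get_constants(node.r))
    return consts
end

# A tree shaped like op(op(x1, 2.5), 0.3) yields [2.5, 0.3].
get_constants(branch(branch(leaf_var(1), leaf_const(2.5)), leaf_const(0.3)))
```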