Releases: MilesCranmer/SymbolicRegression.jl
v1.7.1
SymbolicRegression v1.7.1
Merged pull requests:
- Get `loss_function_expression` working on distributed workers (#412) (@MilesCranmer)
Closed issues:
- [BUG]: `scitype` warning (#405)
v1.7.0
What's Changed
New Features
- Parametric Template Expressions: You can now add learnable parameters to template expressions! It's easiest to set this up with the new `@template_spec` macro: (#394)

```julia
@template_spec(expressions=(f, g), parameters=(amplitude=2, phase=2, offset=1)) do x, class
    amplitude[class] * cos(f(x) + phase[class]) + g(x)^2 + offset[1]
end
```
- Loss Functions on `AbstractExpression` objects: New `loss_function_expression` parameter enables custom loss functions that operate directly on `TemplateExpression` and other expression objects. (#408)
- Expression specifications: Rather than setting both `expression_type` and `expression_options`, there is now a unified `expression_spec` argument for `Options` and `SRRegressor`. Use `ParametricExpressionSpec` for parametric expressions, and `TemplateExpressionSpec` for template expressions. (Though the latter has the `@template_spec` shorthand.)
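As a sketch of the new `loss_function_expression` hook: the custom loss below assumes the same `(expression, dataset, options)` signature as the existing tree-level `loss_function` hook, and the helper name `my_expr_loss` is illustrative — check the documentation for your version before relying on the exact signature.

```julia
using SymbolicRegression

# Hypothetical custom loss operating on the full expression object
function my_expr_loss(ex, dataset, options)
    output, completed = eval_tree_array(ex, dataset.X, options)
    completed || return Inf  # penalize expressions that fail to evaluate
    return sum(abs2, output .- dataset.y) / dataset.n
end

options = Options(
    binary_operators=[+, *],
    unary_operators=[cos],
    loss_function_expression=my_expr_loss,
)
```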
Small changes
- Added support for comparison operators (`>`, `<`, `>=`, `<=`) within templates as well as in the operators. (#407)
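A minimal sketch of using comparisons inside a template, assuming (as of this release) that comparison results can be combined arithmetically to gate between subexpressions and that comparison operators may be listed among the search operators; the template shown here is illustrative, not from the release notes.

```julia
using SymbolicRegression

# A gating template: multiplying by a comparison result selects
# whichever subexpression is larger, giving a piecewise model
spec = @template_spec(expressions=(f, g)) do x
    (f(x) > g(x)) * f(x) + (f(x) <= g(x)) * g(x)
end

# Comparison operators can also appear among the search operators:
options = Options(
    binary_operators=[+, *, >],
    unary_operators=[cos],
    expression_spec=spec,
)
```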
Deprecations
- The `expression_type` and `expression_options` parameters are now deprecated in favor of the unified `expression_spec` interface.
Example Usage
Class-Conditional Model with Learnable Parameters
```julia
# Define template with class-specific parameters
model_template = @template_spec(
    expressions=(base, modifier),
    parameters=(coeff=5,)
) do x, class
    coeff[class] * (base(x) + modifier(x^2))
end

# Set up search
model = SRRegressor(
    expression_spec=model_template,
    binary_operators=[+, *],
    unary_operators=[cos],
    niterations=500
)

# X contains features x and class labels
X = (x=rand(100) .* 10, class=rand(1:5, 100))
coeffs = [2.0, -0.5, 4.0, 0.2, 1.0]
y = [coeffs[X.class[i]] * (cos(X.x[i]) - X.x[i]^2) for i in 1:100]

using MLJBase: fit!, machine
fit!(machine(model, X, y))
```
Full Changelog: v1.6.0...v1.7.0
v1.6.0
SymbolicRegression v1.6.0
Merged pull requests:
- Add abstract types for single- and multi-target MLJ Regressors. (#398) (@atharvas)
- fix: allow for variable `nthreads` (#401) (@MilesCranmer)
Closed issues:
- [Feature] Reusing parts of equation (#113)
v1.5.2
SymbolicRegression v1.5.2
Merged pull requests:
- Broaden MLJ `target_scitype` only when using `TemplateExpression` (#392) (@MilesCranmer)
- Change `get_tournament_selection_weights` function signature (#395) (@atharvas)
- fix for `turbo` and `bumper` not being used in `TemplateExpression` (#399) (@MilesCranmer)
Closed issues:
- [BUG]: [MLJ Interface] `SRRegressor` likely has too broad a `target_scitype` (#390)
v1.5.1
SymbolicRegression v1.5.1
Merged pull requests:
- CompatHelper: bump compat for DynamicExpressions to 1.9, (keep existing compat) (#391) (@github-actions[bot])
- fix: higher order safe operators (#396) (@MilesCranmer)
- fix: add `literal_pow` for composable expression (#397) (@MilesCranmer)
Closed issues:
- [BUG]: Symbolic regression fails with a dimension mismatch (#389)
v1.5.0
SymbolicRegression v1.5.0
Merged pull requests:
- feat: add safe versions of `asin` and `acos` (#388) (@MilesCranmer)
v1.4.0
What's Changed
- Re-use allocations in mutation loop by @MilesCranmer in #387
- Add differential operator by @MilesCranmer in #386
Full Changelog: v1.3.1...v1.4.0
v1.3.1
SymbolicRegression v1.3.1
Merged pull requests:
- refactor: reduce allocations from type assertion (#385) (@MilesCranmer)
v1.3.0
SymbolicRegression v1.3.0
Merged pull requests:
- Expose recorder function for DynamicAutodiff.jl (#377) (@MilesCranmer)
- feat: allow user-specified `stdin` (#382) (@MilesCranmer)
v1.2.0
SymbolicRegression v1.2.0
Merged pull requests:
- fix: add missing `condition_mutation_weights!` to fix #378 (#379) (@MilesCranmer)
Closed issues:
- [BUG]: `nested_constraints` incompatible with `TemplateExpression` (#378)