Fix small spelling inconsistencies

Jip J. Dekker 2021-07-25 15:32:42 +10:00
parent 3325a77cc1
commit c53d492e11
No known key found for this signature in database
GPG Key ID: 517DF4A00618C9C3
3 changed files with 4 additions and 4 deletions

@@ -805,8 +805,8 @@ Times are averages of 10 runs.
Points below the line indicate that the new system is faster.
On average, the new system achieves a speed-up of 5.5, with every \instance{} achieving a speed-up of at least 2.5 and multiple \instances{} achieving speed-ups of over 100.
In terms of memory performance (\cref{sfig:rew-comparemem}), version 2.5.5 can sometimes still outperform the new prototype.
-We have identified that the main memory bottlenecks are our currently unoptimised implementations of \gls{cse} lookup tables and argument vectors.
-These are very encouraging results, given that we are comparing a largely unoptimised prototype to a mature piece of software.
+We have identified that the main memory bottlenecks are our currently unoptimized implementations of \gls{cse} lookup tables and argument vectors.
+These are very encouraging results, given that we are comparing a largely unoptimized prototype to a mature piece of software.
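For context, common subexpression elimination (\gls{cse}) during rewriting maintains a lookup table keyed by each call together with its argument vector, so that a repeated expression is flattened only once and the introduced variable is reused. The following is a minimal illustrative MiniZinc fragment; the variables and constraints are hypothetical and not taken from the benchmark suite.

var -10..10: x;
var -10..10: y;

% Both constraints contain the call abs(x - y). With CSE, the rewriter
% flattens it once, records the result in the lookup table under the call
% and its argument vector, and reuses it for the second occurrence.
constraint abs(x - y) >= 2;
constraint abs(x - y) <= 8;

solve satisfy;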
\begin{figure}
\centering

@@ -560,7 +560,7 @@ For one of the \solvers{}, we also compare with an oracle approach that can dire
As such, we show that the use of \gls{rbmo} introduces an insignificant computational overhead.
Our second experiment compares the performance of using \minizinc{} incrementally.
-We compare our two methods, \gls{rbmo} and the incremental constraint modelling interface, against the baseline of continuously \gls{rewriting} and reinitialising the \solver{}.
+We compare our two methods, \gls{rbmo} and the incremental constraint modelling interface, against the baseline of continuously \gls{rewriting} and reinitializing the \solver{}.
For this comparison, we compare the time required to (repeatedly) rewrite an \instance{} and the time required by the \solver{}.
The first model contains a lexicographic objective.
The second model is shared between the two experiments and uses a round-robin \gls{lns} approach.

@@ -128,7 +128,7 @@ Since multiple iterations of \gls{meta-optimization} typically share large parts
As an additional improvement, the changes observed in the \gls{slv-mod} can be incrementally applied within the \solver{}.
Ideally, the \solver{} can fully support the incremental changes made to the \gls{slv-mod}.
-This avoids the overhead of re-initialisation and allows the solver to retain all search information.
+This avoids the overhead of reinitialization and allows the solver to retain all search information.
Otherwise, the \solver{} can still be warm-started.
Instead of starting the search without any information, the \solver{} is given information about the previous \gls{sol} to speed up its search.
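To illustrate, a warm start can also be expressed in a \minizinc{} model itself through the standard library's warm_start annotation (available in recent \minizinc{} releases). This is a minimal sketch with hypothetical variables and hint values; the prototype described here may instead pass the previous \gls{sol} directly through its solver interface.

include "warm_start.mzn";

array[1..4] of var 1..10: x;
var int: obj = sum(x);

% Hint values for x, e.g. copied from the previous solution; solvers that
% support warm starts treat them as a starting point, not as constraints.
solve :: warm_start(x, [3, 1, 4, 1]) minimize obj;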