In resistance training, three exercises are considered "fundamental" and have become the emblem of weightlifting: the squat, the bench press and the deadlift. When the focus is strength training, i.e. increasing strength, very precise methods, programs, overload percentages and recovery times must be considered. The first, simpler step is to track your progress towards an objective goal: increasing the weight lifted!

Strength training has many benefits, several of which I have already covered in a previous article.

One of the most important factors at the base of this kind of training is progressive overload: the weight must increase from session to session, otherwise the exercises stop being a training stimulus. Keeping track of progress is therefore necessary, as well as a good habit.
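As a minimal sketch of the idea (the starting load and increment here are purely illustrative example values, not a prescription), linear progressive overload can be written as:

```matlab
% Illustrative linear progression: a fixed increment added each session.
% start_load and increment are made-up example values.
start_load = 60;          % kg, load at the first session
increment  = 2.5;         % kg added per session
sessions   = 1:8;         % eight weekly sessions
loads = start_load + increment*(sessions-1);
disp(loads)               % 60.0 62.5 65.0 ... 77.5
```

In practice the progression is rarely this regular, which is exactly why tracking the real numbers matters.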

## Fundamental exercises

The squat is one of the main exercises in weight training. It is a multi-joint exercise for the development of the lower limbs that also involves the abdomen and torso. The glutes and quadriceps are the prime movers, followed by the hamstrings, abdominal wall, gluteus medius and minimus, gastrocnemius and sacrospinalis. The movement starts from a standing position by bringing the hips back, then bending the knees and lowering the torso, all without flexing the back. A state-of-the-art squat goes below parallel, but there are several variations. The same goes for the overload: typically a barbell resting on the trapezius is used, but many variations exist.

### The classic one

Another key exercise is the flat barbell press, better known as the bench press. Used in many areas of training, it is also one of the three competition lifts in powerlifting. The exercise targets the pectoralis major while also working the anterior deltoids, serratus anterior and triceps. The overload is lowered to chest level and then pushed up until the arms are fully extended.

Then there is the deadlift, a truly complete exercise that involves a large part of the body's muscles. The conventional deadlift can be divided into three phases. It starts with the setup, where you stand with the barbell almost touching your shins and the spine kept straight. Then comes the pull, the phase of maximum effort in which the weight is lifted. Finally, the lockout, where you stand upright engaging the muscles of the lumbar spine and abdomen together with the glutes. The exercise ends by carrying out the three phases in reverse and placing the overload back on the ground.

To these three exercises it is also worth adding the overhead press and some vertical pulling work, such as the lat machine or, better still, weighted pull-ups, plus dips on parallel bars.

There is an endless number of exercises, but these are the ones I consider most useful and whose data I will track below.

## Case study

The literature contains a lot of information on how, which and how many exercises to do depending on your goal. Different schools of thought exist, but most of the principles are shared. There are, however, many variables to take into account in order to make a training path effective and efficient.

In this case, I tried to put some of those principles into practice while tracking progress. For this first part I limited myself to recording the overload and the sets (series and repetitions) of each session. To do this, I implemented a small Matlab script that graphs this data, making visible the most important information that is often hidden behind a table of numbers. The script also fits a growth model, but this must be read carefully: to be realistic it must take several factors into account, and above all it is strongly affected by the quantity of data and by the athletic level of the subject.

To get a practical confirmation of all this theory I decided to test it myself.

## Code

The code starts by reading a simple .csv table that can be easily produced with any spreadsheet and edited even from a smartphone. This allows you to update the data remotely, even during the training session itself. It is sufficient to call the gym function, `gym('file.csv')`, passing it the table to obtain everything described below. First of all, the table must be structured as follows in order for the algorithm to work.
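As an illustration (the column names are inferred from the code below: `Giorno` holds the date, `SxR` the programmed sets x reps, and one `SerieN` column per set in REPSxWEIGHT form; the rows here are made-up sample values):

```csv
Giorno,SxR,Serie1,Serie2,Serie3
01-03-2021,3x10,10x60,10x60,8x62.5
08-03-2021,3x10,10x62.5,10x62.5,10x62.5
```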

The tables can be easily managed in a spreadsheet (Excel, Numbers, etc.) and exported as csv (Matlab adapts quite well to different table encodings), or built directly in Matlab.

The algorithm begins by loading the table into the workspace. It also stores the file name in memory, without extension.

```matlab
T = readtable(filex);
[~, name_temp, ~] = fileparts(filex); % file name without extension
name = string(name_temp);
```

Subsequently, the SxR column is split into two vectors containing the corresponding indexed values. The same is done for the single sets, where the data is entered as REPSxWEIGHT. This gives both a maximum estimate based on the sets and repetitions as programmed (e.g. 3x10) and a record of any failed or off-program sets. This will be useful in calculating the tonnage.

```matlab
% Split the SxR column ("sets x reps") into SET and REP vectors
sxr=string(table2array(T(1:numel(T.Giorno),strmatch('SxR',T.Properties.VariableNames))));
SR=split(sxr,'x');
for i=1:numel(T.Giorno)
    SET(i)=str2num(SR(i,1));
    REP(i)=str2num(SR(i,2));
end
%...
% Normalize the exercise name: first letter upper case, the rest lower case
for k=1:strlength(name)
    if k==1
        name{1}(k)=upper(name{1}(k));
    else
        name{1}(k)=lower(name{1}(k));
    end
end
% Convert the dates in the Giorno column to datetime
for i=1:numel(T.Giorno)
    datevec(i)=datetime(T.Giorno{i},'Format','dd-MM-uuuu');
end
T.Date=datevec';
% Split every "SerieN" column (REPSxWEIGHT) into reps and weights
datigrezzi=table2array(T(1:numel(T.Giorno),strmatch('Serie',T.Properties.VariableNames)));
datigrezzi_2=split(datigrezzi,'x');
% ---> convert to numbers
datigrezzirep=datigrezzi_2(:,:,1);
datigrezzirep(cellfun('isempty',datigrezzirep))={'NaN'};
datigrezzipeso=datigrezzi_2(:,:,2);
datigrezzipeso(cellfun('isempty',datigrezzipeso))={'NaN'};
for i=1:numel(datigrezzirep(:,1))
    for j=1:numel(datigrezzirep(1,:))
        datirep(i,j)=str2num(cell2mat(datigrezzirep(i,j)));
        datipeso(i,j)=str2num(cell2mat(datigrezzipeso(i,j)));
    end
end
```


In this same initialization the file name, which reflects that of the exercise, is also standardized, and the dates are converted to datetime so that Matlab can actually work with them. The variables are then converted into matrices and vectors so that algebraic operations can be performed on them.

## Graphs and stats

A box-and-whiskers diagram is then plotted. The boxplot() function reads the values from the matrix containing the overload for each set and day, datipeso(set,day).

```matlab
figure()
boxplot(datipeso',datetime(datevec,'Format','dd-MM'),'MedianStyle','target')
value=max(max(datipeso))-min(min(datipeso));
ylim([min(min(datipeso))-value max(max(datipeso))+value/2])
hold on
plot(nanmedian(datipeso'),'--r')
hold off
title(name)
```

The boxplot is statistically a very informative graph which, in this case, is somewhat wasted. The center point is the median and the box represents the quartiles (the 25th and 75th percentiles), while the whiskers represent the maximum and minimum. The statistics are rather reductive on a number of sets generally between 3 and 10 per exercise. Still, it shows at a glance whether on a given day the same load was used throughout (no box), whether you started from a very low overload and finished most sets with a high load (center point at the top and a long lower whisker), or whether the sets started light and ended heavy (elongated box with a central point).

These estimates can help in optimizing training and periodization, as well as allow you to refine the training program and have a visual feedback on the data collected.

Subsequently, the average for each day (the plain mean of the loads) and a weighted average (the mean of the overload weighted by the number of repetitions) are also plotted.

```matlab
% Weighted mean: each set's load weighted by its number of repetitions
meanweigth=nansum((datipeso.*datirep)')./nansum(datirep');
%...
figure()
plot(datevec,meanweigth,'or')
hold on
plot(datevec,max(datipeso'),'^g')
plot(datevec,max(datipeso'),'--g')
plot(datevec,meanweigth,'--b')
ylim([min(min(datipeso))-value max(max(datipeso))+value/2])
title(name+' mean vs max','from '+string(datevec(1))+' to '+string(datevec(length(datevec))))
legend('Mean value', 'Max value')
```


### Plots

A total of 4 graphs are produced (average, weighted average, maxima, comparison), and the averages are then also compared with the maximum values.

A slightly more refined parameter is tonnage. In the context of training, it represents the training volume and is calculated as sets x reps x overload. In addition, monthly growth rates are also calculated.

```matlab
% Tonnage: for each day, the sum of reps x weight over all sets
for i=1:numel(datipeso(:,1))
    ton(i)=nansum(datirep(i,:).*datipeso(i,:));
end
figure()
plot(datevec,ton,'--k')
hold on
plot(datevec,ton,'^r')
ylim([(min(ton)-(max(ton)-min(ton))/2) (max(ton)+(max(ton)-min(ton))/2)])
title(name+' ton')
% Monthly increments: step by 4, since sessions are 1 week apart (4 weeks = 1 month)
indice=1:4:numel(datevec);
maxpeso=max(datipeso');
pesoincremento=maxpeso(indice);
meseincremento=datetime(datevec(indice),'Format','MMMM');
for i=1:(numel(pesoincremento)-1)
    incremento(i)=(pesoincremento(i+1)-pesoincremento(i))/pesoincremento(i);
    testoincremento(i)=strcat('increment between'," ",string(meseincremento(i)),' -'," ",string(meseincremento(i+1)),' ='," ",string(incremento(i)*100),'%');
end
testoincremento'
```

These parameters can be very useful both for the athlete and for the coach. Typically they are evaluated over a month, but more experienced trainers can also make evaluations over a mesocycle or macrocycle.

The code ends by exporting all the figures both as pdf files and in the .fig format, which allows them to be reopened and further processed in Matlab.

## Growth model for weight

Several polynomial fits are also performed. First, linear and quadratic fits give a first estimate of how the loads are evolving with respect to the linear and quadratic trends, which best approximate progressive overload.

```matlab
% Convert dates to days elapsed since the first session
for i=1:numel(datevec)
    day_val(i)=days(datevec(i)-datevec(1));
end
f1=fit(day_val',meanweigth','poly1');
f1_max=fit(day_val',max(datipeso')','poly1');
f2=fit(day_val',meanweigth','poly2');
f2_max=fit(day_val',max(datipeso')','poly2');
figure()
subplot(2,1,1)
plot(linspace(min(day_val),max(day_val),100),f1(linspace(min(day_val),max(day_val),100)),'-r');
hold on
plot(linspace(min(day_val),max(day_val),100),f1_max(linspace(min(day_val),max(day_val),100)),'--g');
plot(day_val,meanweigth,'ob')
hold off
xlim([0 max(day_val)+10])
title(name+' linear growth model');
subplot(2,1,2)
plot(linspace(min(day_val),max(day_val),100),f2(linspace(min(day_val),max(day_val),100)),'-r');
hold on
plot(linspace(min(day_val),max(day_val),100),f2_max(linspace(min(day_val),max(day_val),100)),'--g');
plot(day_val,meanweigth,'ob')
xlabel('Days')
legend('Mean growth model','Max growth model','Mean value','Location','southeast')
xlim([0 max(day_val)+10])
% Variable-order polynomial fit (degree is chosen by the user)
fp=polyfit(day_val,meanweigth,degree);
plotfp = polyval(fp,linspace(min(day_val),max(day_val)*1.2,200));
figure()
plot(day_val,meanweigth,'or')
hold on
plot(linspace(min(day_val),max(day_val)*1.2,200),plotfp,'b')
% Build a LaTeX string of the fitted polynomial for the plot subtitle
poly='';
for i=1:degree+1
    if i==degree+1
        poly=strcat(poly,'+',string(fp(i)));
    elseif i==1
        poly=strcat(string(fp(i)),'\cdot x^',string(degree+1-i));
    else
        poly=strcat(poly,'+',string(fp(i)),'\cdot x^',string(degree+1-i));
    end
end
poly=strrep(poly,'x^1','x'); % tidy up the polynomial
polynom=strcat('$',poly,'$');
title(name+', '+ degree + ' degree poly growth model');
subtitle(polynom,'Interpreter','latex')
% 2D interpolation: weight as a function of day and set number
for i=1:numel(datipeso(1,:))
    D1(:,i)=day_val';
end
for i=1:numel(datipeso(:,1))
    R1(i,:)=1:1:numel(datipeso(1,:));
end
D=D1(:);
R=R1(:);
dati=datipeso(:);
dati(isnan(dati))=0;
%
f=fit([D, R],dati(:),'linearinterp');
figure()
plot(f,[D, R],dati(:))
hold on
ylabel('Repetition')
xlabel('Days')
zlabel('Weight')
zlim([mean(dati(:))-(max(dati(:))/2) mean(dati(:))+(max(dati(:))/2)])
title(name+' days vs repetition');
```

### 2D fitting

In conclusion, a fit in two variables (day and set number) is also performed, which yields a three-dimensional plot comparing the trend of the loads both over time and across the different sets. As a static printout it is not worth much, apart from a few good angles, but being able to interact with the figure gives a truly complete view.

Finally, there is a fit with a polynomial of variable order (degree), which lets you choose the order of the polynomial that approximates the values. This gives a first estimate of the actual growth model. Using this model, short/medium-term forecasts could also be made by plugging in the number of days elapsed since the first recorded day. It can be a first aid in estimating, not so much the training loads, but the optimal targets around which to program the following mesocycles.
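A sketch of such a forecast, assuming `fp` and `day_val` from the fitting code above are already in the workspace (the 90-day horizon is just an example value):

```matlab
% Evaluate the variable-order polynomial model at a future date,
% expressed as days since the first recorded session.
future_day = max(day_val) + 90;       % e.g. 90 days past the last session
predicted  = polyval(fp, future_day); % model estimate of the mean load
fprintf('Predicted mean load at day %d: %.1f kg\n', future_day, predicted)
```

As discussed below, such extrapolations should be read as rough targets, not as expected performance.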

### Mathematical model

In particular, a real polynomial is obtained, such as:

$$\text{Squat growth model} := 0.50478\cdot x + 70.7626$$

$$\text{Pull up growth model} := 0.1079\cdot x + 8.7004$$

Some clarifications are needed on this growth model.

First, the model is variable and takes several months of data collection to establish. On the basis of the model alone no conclusion can be drawn; however, it can be integrated with the other information of a training plan, making the plan more valid overall.

If on the one hand the model requires a lot of data, on the other hand it clashes with human physiology. Initially the increments can be very high: for a beginner, the literature reports gains of 30% to 70% within a few months. These increases then shrink for intermediate athletes, down to a few percentage points per year for very advanced ones. It is therefore very difficult to estimate a single model.

It is also true that such complete data sets are hard to come by, so it is best to use the model only on the reference set, to visualize the growth trend rather than to make actual estimates. Over a period of a few months the best model is clearly the linear one, or at most a very gentle quadratic trend.

## Disclaimer

The training plan is not shown in this article. It was structured by a professional trainer who evaluated several aspects, such as the time available, other types of training, physique, personal goals and past injuries.

The datasets of the various exercises cover different time periods because of injuries, so it is neither sensible nor possible to correlate them with each other. The validity of the growth models, any corrections to them and the final results will be discussed in another article. Charts may have been updated after the publication date.

Most datasets involve beginner/intermediate loads, where the overload increases are much more pronounced than for a professional athlete.

## Full code

The full code is available on GitHub: