\section{Validation}
\label{sec:Validation}

\begin{figure*}[t]
	\centering
	\includegraphics[width=\textwidth]{results.pdf}
	\caption{Performance Results}
	\label{fig:results}
	%\vspace{-0.5cm}
\end{figure*}


To validate our approach, we considered two use-case applications: the \textsc{MS} running example of Section~\ref{sec:Application-Example}, and an Auction Sales Management System (\textsc{Asms}). The \textsc{Asms} comprises 122 classes, 797 methods, and about 11 kLoC. It implements an auction system in which users can buy or sell products online after joining an auction and placing bids; users can also post and read comments within an auction session. The evaluation specifically targeted two performance-related research questions:
\begin{itemize}
	\item Does the tool perform well enough to be used in practice?
	\item What are the main factors affecting the tool's performance?
\end{itemize}
To answer these questions, we defined a security policy for each example application and ran a scenario covering the different policy management operations (obligation activation, violation, access control, etc.). We measured three factors: the time needed to perform policy management operations, to evaluate an access request, and to update the (obligation) policy state, for different application sizes and policy-state sizes (i.e., numbers of activated obligations).
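The measurement loop can be sketched as follows. This is a hypothetical harness, not the actual evaluation code: the class name \texttt{Bench} and the use of \texttt{System.nanoTime()} to sample wall-clock time per operation are our illustrative assumptions.

```java
import java.util.function.IntConsumer;

// Hypothetical sketch of the measurement loop: the wall-clock time of one
// policy operation is sampled while the policy-state size (the argument
// passed to the operation) is varied across runs.
class Bench {
    // Times one run of `op` for the given policy-state size, in microseconds.
    static double timeMicros(int size, IntConsumer op) {
        long t0 = System.nanoTime();
        op.accept(size);
        return (System.nanoTime() - t0) / 1_000.0;
    }
}
```

Repeating such a measurement for each operation type and each state size yields the curves reported below.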

Figure~\ref{fig:results} shows the results: policy management operations and access-request evaluations complete in a few milliseconds and represent an almost constant overhead. Obligation processing time, in contrast, grows with the number of activated obligations in the system: after each state update, the context of every activated obligation must be checked individually to determine whether it has been canceled, violated, or fulfilled. We are currently investigating ways to reduce this processing time.
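The linear growth can be illustrated with a minimal sketch of the per-update check. All names here (\texttt{ObligationMonitor}, \texttt{Obligation}, the \texttt{Status} values) are hypothetical; the sketch only shows why each state update costs time proportional to the number of activated obligations.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Hypothetical obligation statuses, mirroring the outcomes named in the text.
enum Status { ACTIVE, FULFILLED, VIOLATED, CANCELED }

// Placeholder for the policy state; a single counter stands in for
// whatever attributes the real contexts are evaluated over.
class State { int clock; }

class Obligation {
    final Predicate<State> canceled, violated, fulfilled;
    Status status = Status.ACTIVE;
    Obligation(Predicate<State> c, Predicate<State> v, Predicate<State> f) {
        canceled = c; violated = v; fulfilled = f;
    }
}

class ObligationMonitor {
    final List<Obligation> activated = new ArrayList<>();

    // Called after each state update: every activated obligation's context
    // is re-evaluated individually, hence the O(n) cost per update that
    // dominates the measured obligation processing time.
    void onStateUpdate(State s) {
        for (Obligation o : activated) {
            if (o.status != Status.ACTIVE) continue;
            if (o.canceled.test(s))       o.status = Status.CANCELED;
            else if (o.violated.test(s))  o.status = Status.VIOLATED;
            else if (o.fulfilled.test(s)) o.status = Status.FULFILLED;
        }
    }
}
```

Any improvement would have to avoid this exhaustive scan, e.g. by indexing obligations on the state attributes their contexts depend on, so that an update only touches the obligations it can affect.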

