---
layout: post
title: 'So many names '
date: '2014-12-26T02:20:00.000-08:00'
author: Alex
tags:
- Machine Learning
- Terminology
modified_time: '2015-02-08T09:53:32.609-08:00'
blogger_id: tag:blogger.com,1999:blog-307916792578626510.post-5723196622826928884
blogger_orig_url: http://brilliantlywrong.blogspot.com/2014/12/so-many-names.html
---

<div>It is common for a simple concept to go by many different names, but the terms used in Machine Learning are absolutely special.<br /><br />
ROC curve! 'Receiver operating characteristic' sounds like something terribly new, but it is the same thing as a P-P plot in statistics.<br />
(HEP people frequently draw it mirrored.)<br /><br />
What about TPR? That is the 'true positive rate' in machine learning, but HEP people say 'signal efficiency', or just 's'.<br />
And the survival function in statistics means essentially the same thing, though it is rarely applied to comparing distributions.<br /><br />
<b>Log-likelihood</b> in statistics was renamed to <b>logloss</b> in machine learning. Those who train neural networks usually call it the <b>cross-entropy</b> loss function, while scikit-learn developers call it <b>binomial deviance</b>. And I am still not sure that I know all the names.<br /><br />
At first such things simply drive you crazy. Later, the difference stops mattering to you.<br />
But it turns into a real problem much later, because the same things are discussed in different terms, and you do not know which one to use when searching on the internet.<br /><br />
</div>
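
To make the point concrete, here is a minimal sketch in plain Python (the data and function names are illustrative, not from any particular library): one formula that answers to 'log-loss', 'cross-entropy', and 'binomial deviance', and one that answers to 'TPR', 'recall', 'sensitivity', and 'signal efficiency'.

```python
import math

def log_loss(y_true, y_pred):
    """Binary cross-entropy, a.k.a. log-loss, a.k.a. binomial deviance
    (up to a constant factor): -mean(y*log(p) + (1-y)*log(1-p))."""
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(y_true, y_pred)) / len(y_true)

def true_positive_rate(y_true, y_pred, threshold=0.5):
    """TPR / recall / sensitivity -- what HEP calls 'signal efficiency':
    the fraction of true positives that pass the threshold."""
    positive_scores = [p for y, p in zip(y_true, y_pred) if y == 1]
    return sum(p >= threshold for p in positive_scores) / len(positive_scores)

y_true = [1, 0, 1, 1, 0]
y_pred = [0.9, 0.2, 0.6, 0.4, 0.1]

print(log_loss(y_true, y_pred))            # one number, at least three names
print(true_positive_rate(y_true, y_pred))  # 2 of 3 signal events pass -> 0.666...
```

Whatever a paper or library calls these quantities, the code is the same, which is exactly why searching by the wrong name is so frustrating.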