Signal-to-noise ratio can be defined as the mean value of a signal (S) divided by the standard deviation of the background noise (N). As a rule of thumb, a signal becomes difficult or impossible to distinguish visually when S/N drops below about 2 or 3.
In this demo, a 500-point background of Gaussian white noise is generated with a standard deviation defined by the user. Then the user-defined signal level is added to four consecutive points in the middle of the array, and the entire array is plotted. The mean of the four "signal" points is calculated (S), as well as the standard deviation of the background (N). The actual signal-to-noise ratio (S/N) is calculated and displayed.
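The procedure above can be sketched in Python with NumPy. This is a minimal illustration, not the demo's original code; the function name `snr_demo` and its parameters are hypothetical, and a fixed random seed is used so the run is repeatable.

```python
import numpy as np

def snr_demo(signal_level=5.0, noise_sd=1.0, n=500, seed=0):
    """Simulate the demo: Gaussian white-noise background with a
    4-point signal added in the middle; return S, N, and S/N.

    (Hypothetical helper for illustration; parameter names are
    assumptions, not part of the original demo.)
    """
    rng = np.random.default_rng(seed)
    y = rng.normal(0.0, noise_sd, n)        # 500-point Gaussian white-noise background
    mid = n // 2
    y[mid:mid + 4] += signal_level          # add the signal to four consecutive points

    S = y[mid:mid + 4].mean()               # mean of the four "signal" points
    # standard deviation of the background (excluding the signal points)
    N = np.concatenate([y[:mid], y[mid + 4:]]).std()
    return S, N, S / N
```

Note that the measured S/N will fluctuate from run to run, because the four signal points also contain noise; it will only approximate the ratio of the user-defined signal level to the noise standard deviation.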
(Reference: Skoog, Holler, and Crouch, Principles of Instrumental Analysis, 6th ed., Thomson Brooks/Cole, 2007.)