Any usability-testing method where the evaluator and user participant are not in the same location. Remote evaluation may be moderated, with the evaluator observing the participant in real time, or automated (unmoderated), with the participant working without direct observation or interaction.
The term "remote evaluation" spans a wide range of methods that collect different kinds of data. At one extreme, there is little difference from in-person, task-based lab testing, except that the moderator and participant are in different places. At the other, there are no user tasks at all, and the data collected is aggregated analytics.
Bolt and Tulathimutte diagrammed the methods on two axes: qualitative (moderated) vs. quantitative (unmoderated), and concrete vs. conceptual (how closely the method reveals actual behavior on a completed interface).
They group together all qualitative (moderated) methods, which use remote screen sharing and audio: a participant and moderator work together in real time. Tools include Adobe Connect, GoToMeeting, NetMeeting, LiveLook, UserVue, Skype, WebEx, Glance, Yuuguu.
They arrange quantitative (unmoderated) methods for collecting task data on a range from concrete to conceptual:
Testing on live sites/apps. Tools include UserZoom, RelevantView, WebEffective, Webnographer
Testing wireframes. Tools include Chalkmark, Usabilla
Testing conceptual artifacts. Tools include online card sorting, OptimalSort, WebSort
Quantitative (unmoderated) methods without tasks include:
User analytics on live sites. Tools include ClickTale, ClickHeat