Naicker, P., Anoopkumar-Dukie, S., Grant, G. D., & Kavanagh, J. J. (2016). Medications influencing central cholinergic neurotransmission affect saccadic and smooth pursuit eye movements in healthy young adults. Psychopharmacology. https://doi.org/10.1007/s00213-016-4436-1

Tong, I., Mohareri, O., Tatasurya, S., Hennessey, C., & Salcudean, S. (2015). A retrofit eye gaze tracker for the da Vinci and its integration in task execution using the da Vinci Research Kit. 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2043–2050. https://doi.org/10.1109/IROS.2015.7353648

Robillard, J. M., Cabral, E., Hennessey, C., Kwon, B. K., & Illes, J. (2015). Fueling hope: Stem cells in social media. Stem Cell Reviews and Reports, 11(4), 540–546. https://doi.org/10.1007/s12015-015-9591-y

Hennessey, C. A. (2014). System and method for analyzing three-dimensional (3D) media content (Patent No. US8913790 B2). http://www.google.com/patents/US8913790

Hennessey, C. A., Fiset, J., & Sullivan, N. (2014). System and method for calibrating eye gaze data (Patent No. US20140320397 A1). http://www.google.com/patents/US20140320397

Hennessey, C. A., Fiset, J., & St-Hilaire, S. (2014). System and method for using eye gaze information to enhance interactions (Patent No. US20140184550 A1). http://www.google.com/patents/US20140184550

Hennessey, C. A., & Fiset, J. (2013). System and method for interacting with and analyzing media on a display using eye gaze tracking (Patent No. US20130235347 A1). http://www.google.com/patents/US20130235347

Robillard, J. M., Johnson, T. W., Hennessey, C., Beattie, B. L., & Illes, J. (2013). Aging 2.0: Health information about dementia on Twitter. PLOS ONE, 8(7), e69861. https://doi.org/10.1371/journal.pone.0069861

Hennessey, C. A., & Lawrence, P. D. (2013). Methods and apparatus for estimating point-of-gaze in three dimensions (Patent No. US8457352 B2). http://www.google.com/patents/US8457352

Hennessey, C., & Fiset, J. (2012). Long range eye tracking: Bringing eye tracking into the living room. Proceedings of the Symposium on Eye Tracking Research and Applications, 249–252. https://doi.org/10.1145/2168556.2168608

Lawrence, P. D., Hennessey, C. A., Calviño-Fraga, J., Ivanov, A., Pulfrey, D. L., Salcudean, S. E., Yedlin, M., Davies, M. S., Chrostowski, L., Madden, J., Mirabbasi, S., & Walus, K. (2011). Comparing student assessed competencies in PBL and traditional ECE programs. Proceedings of the Canadian Engineering Education Association. http://ojs.library.queensu.ca/index.php/PCEEA/article/view/3817

Hennessey, C. A. (2010). Method for automatic mapping of eye tracker data to hypermedia content (Patent No. US20100295774 A1). http://www.google.com/patents/US20100295774

Chen, C., & Hennessey, C. (2010). Online eye-gaze usability evaluation of Gmail: Are mobile interfaces easier to use with eye-trackers? Proceedings of the 33rd Conference of the Canadian Medical and Biological Engineering Society. http://scholar.google.com/scholar?cluster=17698998794630431614&hl=en&oi=scholarr

Hennessey, C., & Duchowski, A. T. (2010). An open source eye-gaze interface: Expanding the adoption of eye-gaze in everyday applications. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, 81–84. https://doi.org/10.1145/1743666.1743686

Hennessey, C. A., & Lawrence, P. D. (2009). Improving the accuracy and reliability of remote system-calibration-free eye-gaze tracking. IEEE Transactions on Biomedical Engineering, 56(7), 1891–1900. https://doi.org/10.1109/TBME.2009.2015955

Hennessey, C., & Lawrence, P. (2009). Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions. IEEE Transactions on Biomedical Engineering, 56(3), 790–799. https://doi.org/10.1109/TBME.2008.2005943

Hennessey, C., Noureddin, B., & Lawrence, P. (2008). Fixation precision in high-speed noncontact eye-gaze tracking. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 38(2), 289–298. https://doi.org/10.1109/TSMCB.2007.911378

Hennessey, C., & Lawrence, P. (2008). 3D point-of-gaze estimation on a volumetric display. Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, 59. https://doi.org/10.1145/1344471.1344486

Hennessey, C. (2008). Point-of-gaze estimation in three dimensions [University of British Columbia]. https://open.library.ubc.ca/collections/ubctheses/24/items/1.0066824

Hennessey, C., Noureddin, B., & Lawrence, P. (2006). A single camera eye-gaze tracking system with free head motion. Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, 87–94. https://doi.org/10.1145/1117309.1117349
B%22id%22%3A448624%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hennessey%22%2C%22parsedDate%22%3A%222005%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EHennessey%2C%20C.%20%282005%29.%20%3Ci%3EEye-gaze%20tracking%20with%20free%20head%20motion%3C%5C%2Fi%3E%20%5BUniversity%20of%20British%20Columbia%5D.%20%3Ca%20href%3D%27https%3A%5C%2F%5C%2Fopen.library.ubc.ca%5C%2Fcollections%5C%2Fubctheses%5C%2F831%5C%2Fitems%5C%2F1.0064994%27%3Ehttps%3A%5C%2F%5C%2Fopen.library.ubc.ca%5C%2Fcollections%5C%2Fubctheses%5C%2F831%5C%2Fitems%5C%2F1.0064994%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22thesis%22%2C%22title%22%3A%22Eye-gaze%20tracking%20with%20free%20head%20motion%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Craig%22%2C%22lastName%22%3A%22Hennessey%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22thesisType%22%3A%22%22%2C%22university%22%3A%22University%20of%20British%20Columbia%22%2C%22date%22%3A%222005%22%2C%22language%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fopen.library.ubc.ca%5C%2Fcollections%5C%2Fubctheses%5C%2F831%5C%2Fitems%5C%2F1.0064994%22%2C%22collections%22%3A%5B%22BD6CHI6C%22%5D%2C%22dateModified%22%3A%222016-08-21T19%3A42%3A59Z%22%7D%7D%2C%7B%22key%22%3A%22EQSDBCDC%22%2C%22library%22%3A%7B%22id%22%3A448624%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hennessey%22%2C%22parsedDate%22%3A%222000%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EHennessey%2C%20C.%20%282000%29.%20Autonomous%20control%20of%20a%20scale%20airplane.%20%3Ci%3EBachelors%20of%20Engineering%20Science%20Thesis%2C%20Simon%20Fraser%20University%3C%5C%2Fi%3E.%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Autonomous%20control%20of%20a%20scale%20airplane%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Craig%22%2C%22lastName%22%3A%22Hennessey%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222000%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISSN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%22PZW7FIZK%22%5D%2C%22dateModified%22%3A%222016-08-21T19%3A10%3A07Z%22%7D%7D%5D%7D
Naicker, P., Anoopkumar-Dukie, S., Grant, G. D., & Kavanagh, J. J. (2016). Medications influencing central cholinergic neurotransmission affect saccadic and smooth pursuit eye movements in healthy young adults. Psychopharmacology. https://doi.org/10.1007/s00213-016-4436-1
Tong, I., Mohareri, O., Tatasurya, S., Hennessey, C., & Salcudean, S. (2015). A retrofit eye gaze tracker for the da Vinci and its integration in task execution using the da Vinci Research Kit. 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2043–2050. https://doi.org/10.1109/IROS.2015.7353648
Robillard, J. M., Cabral, E., Hennessey, C., Kwon, B. K., & Illes, J. (2015). Fueling Hope: Stem Cells in Social Media. Stem Cell Reviews and Reports, 11(4), 540–546. https://doi.org/10.1007/s12015-015-9591-y
Hennessey, C. A. (2014). System and method for analyzing three-dimensional (3D) media content (Patent No. US8913790 B2). http://www.google.com/patents/US8913790
Hennessey, C. A., Fiset, J., & Sullivan, N. (2014). System and Method For Calibrating Eye Gaze Data (Patent No. US20140320397 A1). http://www.google.com/patents/US20140320397
Hennessey, C. A., Fiset, J., & St-Hilaire, S. (2014). System and Method for Using Eye Gaze Information to Enhance Interactions (Patent No. US20140184550 A1). http://www.google.com/patents/US20140184550
Hennessey, C. A., & Fiset, J. (2013). System and Method for Interacting with and Analyzing Media on a Display Using Eye Gaze Tracking (Patent No. US20130235347 A1). http://www.google.com/patents/US20130235347
Robillard, J. M., Johnson, T. W., Hennessey, C., Beattie, B. L., & Illes, J. (2013). Aging 2.0: Health Information about Dementia on Twitter. PLOS ONE, 8(7), e69861. https://doi.org/10.1371/journal.pone.0069861
Hennessey, C. A., & Lawrence, P. D. (2013). Methods and apparatus for estimating point-of-gaze in three dimensions (Patent No. US8457352 B2). http://www.google.com/patents/US8457352
Hennessey, C., & Fiset, J. (2012). Long Range Eye Tracking: Bringing Eye Tracking into the Living Room. Proceedings of the Symposium on Eye Tracking Research and Applications, 249–252. https://doi.org/10.1145/2168556.2168608
Lawrence, P. D., Hennessey, C. A., Calviño-Fraga, J., Ivanov, A., Pulfrey, D. L., Salcudean, S. E., Yedlin, M., Davies, M. S., Chrostowski, L., Madden, J., Mirabbasi, S., & Walus, K. (2011). Comparing Student Assessed Competencies in PBL and Traditional ECE Programs. Proceedings of the Canadian Engineering Education Association, 0(0). http://ojs.library.queensu.ca/index.php/PCEEA/article/view/3817
Hennessey, C. A. (2010). Method for Automatic Mapping of Eye Tracker Data to Hypermedia Content (Patent No. US20100295774 A1). http://www.google.com/patents/US20100295774
Chen, C., & Hennessey, C. (2010). Online Eye-Gaze Usability Evaluation of Gmail; Are Mobile Interfaces Easier to Use with Eye-Trackers? Proceedings of the 33rd Conference of the Canadian Medical and Biological Engineering Society. http://scholar.google.com/scholar?cluster=17698998794630431614&hl=en&oi=scholarr
Hennessey, C., & Duchowski, A. T. (2010). An Open Source Eye-gaze Interface: Expanding the Adoption of Eye-gaze in Everyday Applications. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, 81–84. https://doi.org/10.1145/1743666.1743686
Hennessey, C. A., & Lawrence, P. D. (2009). Improving the Accuracy and Reliability of Remote System-Calibration-Free Eye-Gaze Tracking. IEEE Transactions on Biomedical Engineering, 56(7), 1891–1900. https://doi.org/10.1109/TBME.2009.2015955
Hennessey, C., & Lawrence, P. (2009). Noncontact Binocular Eye-Gaze Tracking for Point-of-Gaze Estimation in Three Dimensions. IEEE Transactions on Biomedical Engineering, 56(3), 790–799. https://doi.org/10.1109/TBME.2008.2005943
Hennessey, C., Noureddin, B., & Lawrence, P. (2008). Fixation Precision in High-Speed Noncontact Eye-Gaze Tracking. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 38(2), 289–298. https://doi.org/10.1109/TSMCB.2007.911378
Hennessey, C., & Lawrence, P. (2008). 3D Point-of-gaze Estimation on a Volumetric Display. Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, 59–59. https://doi.org/10.1145/1344471.1344486
Hennessey, C. (2008). Point-of-gaze estimation in three dimensions [Doctoral dissertation, University of British Columbia]. https://open.library.ubc.ca/collections/ubctheses/24/items/1.0066824
Hennessey, C., Noureddin, B., & Lawrence, P. (2006). A Single Camera Eye-gaze Tracking System with Free Head Motion. Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, 87–94. https://doi.org/10.1145/1117309.1117349
Hennessey, C. (2005). Eye-gaze tracking with free head motion [University of British Columbia]. https://open.library.ubc.ca/collections/ubctheses/831/items/1.0064994
Hennessey, C. (2000). Autonomous control of a scale airplane [Bachelor of Engineering Science thesis, Simon Fraser University].
Direct Download Links
An Open Source Eye-gaze Interface: Expanding the Adoption of Eye-gaze in Everyday Applications
Craig Hennessey and Andrew T. Duchowski
Proceedings of the 2010 Symposium on Eye Tracking Research & Applications, pp. 81-84, March 2010. Conference Paper (PDF)
Improving the Accuracy and Reliability of Remote System-Calibration-Free Eye-Gaze Tracking
Craig Hennessey and Peter Lawrence, IEEE Transactions on Biomedical Engineering. Vol. 56, no. 7, pp. 1891–1900, July 2009. Full Paper (PDF)
Noncontact Binocular Eye-Gaze Tracking for Point-of-Gaze Estimation in Three Dimensions.
Craig Hennessey and Peter Lawrence, IEEE Transactions on Biomedical Engineering. Vol. 56, no. 3, pp. 790-799, March 2009. Full Paper (PDF)
Point-of-Gaze Estimation in Three Dimensions.
Craig Hennessey, Doctor of Philosophy, Electrical and Computer Engineering, University of British Columbia, 2008. Thesis (PDF)
Fixation Precision in High-Speed Noncontact Eye-Gaze Tracking
Craig Hennessey, Borna Noureddin and Peter Lawrence, IEEE Transactions on Systems, Man and Cybernetics – Part B. Vol. 38, no. 2, pp. 289–298, April 2008. Full Paper (PDF)
3D Point-of-Gaze Estimation on a Volumetric Display.
Craig Hennessey and Peter Lawrence, In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, pp. 59-59, 2008. Conference Paper (PDF)
A single camera eye-gaze tracking system with free head motion.
Craig Hennessey, Borna Noureddin and Peter Lawrence, In Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, pp. 87-94, 2006. Conference Paper (PDF)