WO1997004394A1 - Computer software authentication, protection, and security system - Google Patents
Computer software authentication, protection, and security system
- Publication number
- WO1997004394A1 WO1997004394A1 PCT/AU1996/000440 AU9600440W WO9704394A1 WO 1997004394 A1 WO1997004394 A1 WO 1997004394A1 AU 9600440 W AU9600440 W AU 9600440W WO 9704394 A1 WO9704394 A1 WO 9704394A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- code
- software
- executable
- wherein
- entry process
- Prior art date
Links
- 238000000034 method Methods 0.000 claims abstract description 164
- 230000008569 process Effects 0.000 claims abstract description 71
- 238000012544 monitoring process Methods 0.000 claims abstract description 7
- 238000012795 verification Methods 0.000 claims abstract description 4
- 230000004048 modification Effects 0.000 claims description 11
- 238000012986 modification Methods 0.000 claims description 11
- 238000001514 detection method Methods 0.000 claims description 10
- 230000000007 visual effect Effects 0.000 claims description 9
- 230000000694 effects Effects 0.000 claims description 8
- 230000006870 function Effects 0.000 claims description 7
- 239000000872 buffer Substances 0.000 claims description 6
- 230000002441 reversible effect Effects 0.000 claims description 6
- 230000004075 alteration Effects 0.000 claims description 5
- 230000009466 transformation Effects 0.000 claims description 5
- 241000282414 Homo sapiens Species 0.000 claims description 4
- 230000004913 activation Effects 0.000 claims description 4
- 239000003086 colorant Substances 0.000 claims description 3
- 230000004044 response Effects 0.000 claims description 3
- 238000012360 testing method Methods 0.000 claims description 3
- 230000003213 activating effect Effects 0.000 claims description 2
- 230000006835 compression Effects 0.000 claims description 2
- 238000007906 compression Methods 0.000 claims description 2
- 238000013461 design Methods 0.000 claims description 2
- 238000003780 insertion Methods 0.000 claims description 2
- 230000037431 insertion Effects 0.000 claims description 2
- 239000011295 pitch Substances 0.000 claims description 2
- 238000011084 recovery Methods 0.000 claims description 2
- 238000009877 rendering Methods 0.000 claims description 2
- 238000001994 activation Methods 0.000 claims 2
- 230000001133 acceleration Effects 0.000 claims 1
- 238000007689 inspection Methods 0.000 claims 1
- 238000004519 manufacturing process Methods 0.000 claims 1
- 230000000873 masking effect Effects 0.000 claims 1
- 230000010076 replication Effects 0.000 claims 1
- 239000000725 suspension Substances 0.000 claims 1
- 241000700605 Viruses Species 0.000 abstract description 5
- 230000003993 interaction Effects 0.000 abstract description 3
- 230000026676 system process Effects 0.000 abstract description 3
- 230000002708 enhancing effect Effects 0.000 abstract description 2
- 210000001525 retina Anatomy 0.000 abstract description 2
- 238000013479 data entry Methods 0.000 description 17
- 238000004590 computer program Methods 0.000 description 6
- 230000008859 change Effects 0.000 description 4
- 230000001010 compromised effect Effects 0.000 description 2
- 230000002265 prevention Effects 0.000 description 2
- 230000002155 anti-virotic effect Effects 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 230000001934 delay Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 210000000887 face Anatomy 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 230000001020 rhythmical effect Effects 0.000 description 1
- 238000012163 sequencing technique Methods 0.000 description 1
- 238000010561 standard procedure Methods 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 238000000844 transformation Methods 0.000 description 1
- 230000001131 transforming effect Effects 0.000 description 1
- 230000003612 virological effect Effects 0.000 description 1
- 230000003245 working effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/554—Detecting local intrusion or implementing counter-measures involving event detection and direct action
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/10—Protecting distributed programs or content, e.g. vending or licensing of copyrighted material ; Digital rights management [DRM]
- G06F21/12—Protecting executable software
- G06F21/14—Protecting executable software against software analysis or reverse engineering, e.g. by obfuscation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/52—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow
- G06F21/54—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow by adding security routines or objects to programs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/04—Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
- H04L63/0428—Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/14—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
- H04L63/1441—Countermeasures against malicious traffic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2207/00—Indexing scheme relating to methods or arrangements for processing data by operating upon the order or content of the data handled
- G06F2207/72—Indexing scheme relating to groups G06F7/72 - G06F7/729
- G06F2207/7219—Countermeasures against side channel or fault attacks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2149—Restricted operating environment
Definitions
- the present invention relates to a computer program having enhanced security features, and also to a system and method for enhancing the security features of a computer program
- the present invention relates to such a program, and the system and method for creating the program, having increased security features to prevent ID-Data (as defined hereafter) eavesdropping and/or theft and/or to ensure authenticity
- ID-Data will be used to refer to the abovementioned identification, authentication or similar data, excluding ID-Data which is valid only for a single use, or which is designed to expire at regular intervals of less than two minutes
- Viruses, Terminate-and-stay-resident programs (TSRs), co-resident software, multithreaded operating system processes, Trojan Horses, Worms, Hackers, Spoof programs, key-press password capturers, macro-recorders, sniffers, and the like can be effective at stealing ID-Data and are examples of (a) rogue software or (b) people capable of subverting security software or (c) software which can be configured for illegitimate purposes.
- the term rogue software will be used to refer to software or subversions such as the abovementioned (a), (b) and (c), used for the purpose of stealing ID-Data.
- rogue software when used herein also includes software or other means used to tamper with other software.
- tampering is defined hereafter.
- There are many ways to introduce rogue software into a computer system. Viruses spread by introducing themselves automatically. Trojan Horses are usually introduced by tricking users into allowing them to execute (such as by masquerading as a new or well-known computer game or other product).
- Rogue software, once introduced, can steal ID-Data as mentioned hereinbefore. It may monitor the keyboard (for example: by recording every key, as the user presses each one, in order to steal a password as it is being typed in), serial-port, mouse, screen, or other devices to steal ID-Data directly from them. It may monitor other software, applications, the operating system, or disks to steal ID-Data from there also. Once stolen, this ID-Data may be stored locally (for example: in memory or on-disk) or transmitted to remote locations (for example: by modem or network) or used immediately to perform illegal operations.
- eavesdropping will be used to refer to the monitoring of a computer to record ID-Data.
- a key-press recorder could secretly, and unbeknown to the computer user, record all the keys pressed by the user into a hidden system file.
- the information recorded could include a user's password and other sensitive information which an organisation would obviously wish to protect.
- rogue software may remove, disable, or compromise existing computer software security features by modifying the memory, disk, or other image of said computer software.
- Rogue software may also utilise tampering techniques to alter existing computer software in order to steal ID-Data from it, or may attach itself to existing computer software (as is the case with many computer viruses).
- tampering will be used to refer to the abovementioned modification of computer software. Tampering may take place either locally (within a user's PC) or remotely (for example: at one of the points which a computer program passes through as it is being downloaded).
- counterfeit software can be substituted for legitimate software.
- the counterfeit will appear real to a computer user, but actually acts to subvert security, such as by stealing ID-Data.
- Sometimes called "Spoof" programs or Trojan Horses, counterfeit software of this type may invoke the original legitimate software after having stolen ID-Data, so as not to arouse a user's suspicion.
- This invention describes a process which substantially enhances the security of computer software (hereafter referred to as the improved process) and a method by which to apply said improved process (hereafter referred to as the applicator).
- the improved process consists of including computer code to automatically detect tampering of said computer software, and computer code to prevent the theft of ID-Data by replacing existing vulnerable (to rogue software eavesdropping or attack) software or operating system code with secure equivalents which utilise anti-spy techniques (as described later in this document).
- the improved process also consists of including computer code to prevent de-compilation, reverse-engineering, and disassembly by the inclusion of obfuscating code inserts, and the use of executable encryption.
- the improved process also consists of including code to prevent execution-tracing and debugging by the use of code designed to detect and prevent these operations.
- the improved process consists of, or also includes, human-recognisable audio-visual components which permit the authenticity of said computer software to be easily verified by the user on each invocation using techniques described later in this document.
- Fig.1 illustrates the standard operation of a computer system known in the prior art
- Fig.2 illustrates the known operation of a rogue or "spoof" program
- Fig.3 illustrates application code updated with the preferred embodiment
- Fig.4 illustrates the known operation of a rogue eavesdropping program
- Fig.5 illustrates the interaction of the components of the updated application
- Fig.6 illustrates the general structure of the preferred embodiment of the applicator
- Fig.7 illustrates a standard layout for a program to be executed on a computer system
- Fig.8 illustrates the standard layout of an EXE header under the MS-DOS operating system.
- Fig.9 illustrates a standard layout of an EXE program under MS-DOS
- Fig.10 illustrates an altered executable form constructed in accordance with the specific embodiment
- Fig.11 illustrates a first stage of execution of the new.exe executable
- Fig.12 illustrates a second stage of execution of the new.exe executable file
- Fig.13 illustrates a third stage of execution of the new.exe executable file.
- the present invention has general applicability to many different operating systems including Microsoft DOS (Trade Mark), Apple Macintosh Operating System, Unix (Trade Mark) etc.
- Security is provided by (a) hampering examination of software code or operating system code or parts thereof through the use of the encryption or partial encryption of said code, (b) preventing the disassembly of said code through the inclusion of dummy instructions and prefixes and additional code to mislead and hamper disassembly (ie: obfuscating inserts), (c) preventing the computerised tracing of the execution of said code (for example: with code debugging tools) through the use of instructions to detect, mislead, and hamper tracing, (d) preventing tampering of said code through the use of scanning to locate alterations, either or both on-disk and in memory, either once at the start of execution or continuously upon certain events, or (e) preventing ID-Data theft through the inclusion of secure input/output routines (for example: routines to bypass the standard operating system keyboard calls and use custom-written higher-security routines)
- In Fig.1 there is illustrated the standard scenario for "running" a given executable program 16, under the control of a computer operating system 17 on a computer 18.
- the executable program 16 is subjected to modification, as will be described hereinafter, to ensure its integrity and improve its security.
- Aspect 1 Preventing eavesdropping.
- replacement routines may communicate directly with the hardware of the computer (for example, they may communicate with the keyboard circuitry instead of using the system-supplied (and hence possibly insecure) application interface keyboard-entry function-calls) while disabling system interrupts which would permit rogue software to eavesdrop.
- Said replacement routines are coded to store retrieved ID-Data in a secure manner. ID-Data is not stored in full in plaintext (ie: unencrypted) in system or application buffers.
- Aspect 2 Preventing disassembly and examination. As hereinbefore described, it is desirable to hamper disassembly (or de-compilation or reverse engineering) to protect software against eavesdropping and tampering, and to hinder examination of said software which might lead to secret security problems or mistakes being disclosed.
- Obfuscating inserts can successfully prevent automatic disassembly.
- Obfuscation is achieved by following unconditional jump instructions (for example, the Intel JMP or CLC/JNC combination or CALL (without a return expected) or any flow-of-control altering instruction which is known not to return to the usual place) with one or more dummy op-code bytes which will cause subsequent op-codes to be erroneously disassembled (for example, the Intel 0xEA prefix will cause disassembly of the subsequent 4 op-codes to be incorrect, displaying them as the offset to the JMP instruction indicated by the 0xEA prefix instead of the instructions they actually represent).
- Dummy instructions may also be included to hamper disassembly by deliberately misleading a disassembler into believing a particular flow of control will occur, when in fact it will not.
- Flow of control can be designed to occur based upon CPU flag values determined from instructions executed long before. Together with tracing prevention, this makes manual disassembly nearly impossible.
- the majority of the executable portions of the software can be encrypted for external storage.
- the decryption takes place in-memory after the software is loaded from external sources, under the control of a decryption "header" which prevents its own tampering and disassembly etc. This makes manual and automatic disassembly nearly impossible, since the decryption should be designed to fail if tampering or tracing is detected.
- the software can scan the memory image of itself one or more times, or continuously, to ensure that unexpected alterations do not occur.
- Certain modifications to the external copy of software are reflected in subtle changes to the environment in which the modified software will be executed (for example: the size of the code, if altered, will be reflected in the initial code-size value supplied to the executing program being incorrect). Additionally, certain modifications to the operating system and environment of said software can also be monitored (for example: certain interrupt vector table pointers in Intel-processor applications) to detect unexpected changes by rogue software. These changes can also be detected to prevent tampering.
- the five aspects described herein may be combined to provide substantially stronger security than any aspect taken on its own.
- the precalculated check-data as derived during tamper-detection described hereinbefore may actually be one part of the decryption-key which is required to successfully decrypt the remaining executable software.
- Where prevention-of-tracing and environment characteristics are additional portions of said decryption-key, it makes the determination of said decryption-key by any person or computer program other than the secure original an extremely difficult, if not impossible, task.
- Standard Intel x86 interrupts 1 and 3 are used by debuggers to facilitate code tracing. Utilising these interrupts (which are not normally used by ordinary applications) in security-enhanced software hampers debugging, since built-in debugging functions are then not automatically available.
- Disabling the keyboard will hamper debuggers, since tracing instructions are usually issued from the keyboard. Similarly, disabling other places from where tracing instructions are usually issued (eg: serial ports, printer ports, and mouse) or displayed (eg: screen) will also hamper tracing.
- System interrupts can be re-vectored for use within the secure software to perform tasks not usually performed by those interrupts. Debuggers usually rely upon system interrupts also, so doing this would usually disable or destroy a debugger being used to trace the software. Disabling interrupts and performing timing-sensitive instructions between them will further hamper debugging.
- When tracing software, instructions are usually executed one-at-a-time in order for the user to understand their operation. Many system interrupts must occur regularly (eg: timer and memory-refresh operations), so debuggers usually do not disable interrupts even when they encounter an interrupt-disabling instruction. If timers and the like are re-vectored in two separate stages, any timer (etc) interrupt occurring in between the two stages will fail, and usually crash the computer.
- interrupts can be disabled or enabled using obscure means (with flag-altering instructions, for example) to hamper tracing.
- the program stack is usually used by the debugger either during the tracing operations or at other times. This is easily detected, and by using the area of the stack which will be destroyed by unexpected stack-use for code or critical data, software can be designed to self-destruct in this situation.
- Scanning the command environment and the execution instruction can detect the execution of software by unusual means. Searching for "DEBUG" in the command line, or scanning memory for known debuggers, for example, will detect tracing. Additionally, by detecting which operating system process initiated the load of the software, unexpected processes (eg: debuggers) can be detected.
- Bypassing system buffers (eg: the keyboard memory buffer) in favour of direct access to hardware (eg: the keyboard circuitry and internal buffers) will also hamper debuggers, which usually rely in part on system functions in order to operate.
- Code checksums and operating-system checks can be designed to detect debug-breakpoint instruction inserts or other modifications. Using the result of the checksum for some obscure purpose (eg: decryption, or (much later) control-flow changes) will further hamper tracing.
- Aspect 5 Ensuring authenticity.
- a method of providing for a secure entry of ID-Data in a computer system comprising activating a visual display or animation and/or audio feedback (hereinafter called an audio/visual component) as part of said secure entry of ID-Data so as to hamper emulation of said secure entry process
- the animation includes feedback portions as part of the ID-Data entry process
- the animation is repeatable and varied in accordance with the information entered
- the animation preferably comprises 2.5D or 3D animation and includes animation of any ID-Data input
- the animation is designed to tax the computer resources utilised, thereby making any forgery thereof more difficult
- the user interface for the acquiring of ID-Data is secured whereby the duplication of the interface is rendered mathematically complex such that cipher-code breaking techniques are required to produce a counterfeit look-alike interface
- the authentication interface, ie: the ID-Data entry screen - for example a logon screen or a screen for entering credit card details
- the application program allows for a higher degree of security and authenticity even in insecure environments such as the Internet or home software applications
- a rogue's "spoof" program 22 is inserted between application software 16 and the user 23
- the application 16 normally has a portion 24 devoted to ID-Data entry and verification or the entry of commercially sensitive information (including passwords etc) to the application, in addition to the application code 25
- the spoof program 22 is designed to exactly reflect the presented user interface of ID-Data entry code 24 to the user
- the user 23 is then fooled into utilising the masquerading spoof program 22 as if it was the application 16
- the user can be tricked into divulging secret information to the spoof program 22
- An example may include a classic "login spoof" wherein the spoof program 22 prints the login prompt (ie: ID-Data entry) message on the screen and the user mistakes the login prompt for a legitimate one.
- In Fig.4 there is illustrated a relatively new form of rogue attack 40.
- This form of attack proceeds similarly to the spoof attack of Fig.2, with the following difference.
- a rogue program 41 is inserted which secretly eavesdrops on ID-Data entry code 24, or on application code 25, or on operating system 17, or on hardware 18 or elsewhere in order to steal sensitive information directly from the legitimate application. Since the legitimate application is still actually executing, the user's suspicion is not aroused, since rogue program 41 is generally invisible to the user 23.
- executable program 16 may have been tampered with (as hereinbefore described) to reduce its security, alleviating the necessity for the presence of rogue program 41.
- In Fig.5 there is illustrated in detail the structure of an application 50 constructed in accordance with the preferred embodiment running on computer hardware 18.
- Fig.5 is similar to Fig.4 with the important difference that user 23 now communicates directly with secure drivers 51 which are part of the secure ID-Data entry program code 31 which is utilised by the security-enhanced (eg: tamper protected) application code 52. It can be seen that the user 23 no longer communicates with the operating system 17 or the unprotected computer hardware 18, thus the rogue program 41 can no longer eavesdrop on ID-Data.
- In Fig.3 there is illustrated, in more general terms than Fig.5, the structure of an application 30 constructed in accordance with the preferred embodiment wherein secure ID-Data entry program code 31 is provided which is extremely difficult to replicate, eavesdrop upon or subvert.
- the secured ID-Data entry program code 31 can be created utilising a number of different techniques.
- the executable portion of the secured ID-Data entry code can be protected against tracing, disassembly, tampering, viewing, reverse engineering, keyboard entry theft, eavesdropping, hot patching and other attacks by transforming the secured ID-Data entry program code 31 from its normal executable form 16 (Fig.2) to a corresponding secured form of executable (as hereinbefore described - refer aspects 1 to 4).
- These techniques are preferably applied to the application code 16 in general or, less preferably, specifically limited to the ID-Data entry portions 24 thereof.
- secure ID-Data entry program code 31 is itself created.
- This code 31 preferably comprises a complex graphical user interface series of screens and animation designed to make duplication thereof by a rogue extremely difficult.
- the complex user interface should include facilities to disable any frame buffer recording devices, the disablement occurring before each frame is displayed. Also, where a multi-tasking operating system is in use, or where context switching is enabled, switching out of the interface screen is preferably disabled, or ID-Data entry procedures are encrypted or terminated when the interface screen is swapped out.
- the images presented which form part of the ID-Data entry screens comprise complex 3D animation sequences having a high degree of complexity and extensive use of screen colours and screen resolution in addition to visual design so as to make copying thereof extremely difficult.
- the complex computer graphics can be created utilising standard techniques.
- Suitable 3D animation can include the introduction of shadows, the lighting of pseudo-3D animated objects, transparent or translucent objects, shiny, reflective, or mirrored objects, gravitational effects in animated objects, single-image-random-dot-stereogram bitmaps or backdrops, translucent threads, effects such as diffraction patterns, screen masks, backdrops, colour palette "animation", and complex animated objects resistant to simple hidden-surface removal techniques known to those skilled in the art and directed to hindering duplication
- Thwarting attempts at compression of the ID-Data entry screens. This can be achieved by having animation which has high visual entropy (low redundancy) and many graphical elements which are altered from frame to frame in a manner which is highly discernible to the human viewer.
- complex 3D computer imagery having low redundancy will require large amounts of storage space for a rogue attempt at duplication based on recording the screen output, and will therefore be more readily discernible to the user should this form of attack be mounted.
- the animation is further preferably designed to thwart a successful replay attack which is based on providing only a subset (a limited number of frames) of the screen animation to a viewer. This can be achieved, for example, by the inclusion of several animated spheres which "bounce" around the screen and change colours in a manner that is recognisable to the viewing user but which is not readily repeatable.
- a replay of only a subset of the screen animations to the viewer will be highly evident in this case when, upon looping, the user is alerted to a problem when the animation "skips" or "jumps" and does not operate in a previously smooth manner. This makes it difficult for a rogue spoof program to copy the animation without including all parts of it.
- the graphics presented can be customised to the input data entered.
- the information entered by a user can be rendered and/or animated by the secure ID-Data entry program code 31 (Fig.3).
- the animation can be created letter by letter.
- each letter could be rendered differently depending on those characters previously typed.
- the letter "I" might appear as a large "barber's pole" which spirals and changes colour, speed, size, and/or position and is slightly transparent, thereby allowing the animated scene which is a backdrop to the character to be discerned through the character itself.
- the letter "I" would only appear as the specific animated barber's pole that it does if the previous letters entered were "C", "H", and "R" respectively.
- a similarly effective animation technique is to produce only one graphical object after entry of each portion of ID-Data, such as a computer-generated human face, but have the features of said face be determined by a hash or cryptographic function based upon the user's input. For example, after entry of the ID-Data "CHRIS" (in this example, the individual characters may not, themselves, be based on the abovementioned generation procedure), a teenage girl's face with long blonde hair and blue eyes may be displayed. If the "S" was instead a "D", the face would be entirely different.
- the ID-Data used for producing an object for display should not be ID-Data which is designed not to appear on-screen when entered (eg: a password), since the display of a corresponding object would give a rogue information on which to base guesses of the secret ID-Data.
- the rogue programming the co ⁇ esponding spoof program shall have to crack the cryptographic scheme in order to get the selection of character animation correct for any generalised attack.
- a rogue will have to determine the algorithm for producing the face, since human beings are adept at recognising faces, and will immediately notice if the face displayed on the screen is incorrect.
- Such a technique allows for a mathematically secure, visual method to guarantee the authenticity of the software which generates the screen feedback.
- the user of the software is instructed to note their own particular animation sequence and to immediately discontinue utilisation of the application 30 should that sequence ever change.
- the user may also be instructed to contact a trusted person, such as the supplier or operator of the application, to confirm that the animation sequence they witness is the authentic sequence intended by said supplier.
- the particular animation presented for a particular application 30 can be further customised for each application so as to be distinct (such as by the incorporation of the application's name as part of the animated image).
- animated scene timing can be utilised, providing anti-looping and frame removal detection is still catered for.
- the animated scene timing allows for a user to detect unexpected irregularities in a frequently presented animated interface.
- some deliberate regularity, such as the rhythmic convergence of some parts of the animation in one particular spot, can be included.
- a rogue programming a spoof program shall also have to duplicate the preferably complex timing events necessary to accomplish this convergence.
- the regular nature of the scene timing should be high enough so that the user expects to see certain events, thereby making it difficult for a rogue spoof program to copy the animation without including all parts of it.
- all ID-Data is immediately encrypted, which makes recovery of the ID-Data by a rogue through analysis of the computer program memory difficult.
- public-key cryptographic methods (eg: Elliptic-curve, RSA or Diffie-Hellman cryptography) can be utilised.
- a rogue must crack the cryptographic code to decrypt any sensitive information should it be stolen in its encrypted form. Prohibiting all or most interrupts when data is to be entered, and encrypting or hashing the sensitive information immediately so that it is only stored partially, or in an encrypted form, before re-enabling interrupts is one example of achieving this objective.
- analysis of a user's personal characteristics can be included as part of the interface. This can include attempts at recognition of a user's typing style (duration of keypresses, delays between subsequent keys, choice of redundant keys, mouse usage characteristics, etc) or additional authentication techniques, including smartcards and biometric inputs such as fingerprint detectors.
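A minimal sketch of the typing-style comparison mentioned above. The mean-absolute-deviation metric, the `typing_style_matches` name, and the tolerance parameter are illustrative assumptions, not the method of the embodiment.

```c
#include <stdlib.h>

/* Compare observed inter-key delays against a stored typing profile using
   mean absolute deviation; a small deviation suggests the same typist. */
int typing_style_matches(const int *delays_ms, const int *profile_ms,
                         int n, int tolerance_ms) {
    long dev = 0;
    for (int i = 0; i < n; i++)
        dev += labs((long)delays_ms[i] - profile_ms[i]);
    return (int)(dev / n) <= tolerance_ms;   /* 1 = style matches profile */
}
```

A real implementation would also weigh keypress durations and mouse characteristics, as the text suggests; this sketch shows only the inter-key-delay component.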
- the graphical animation routines can be "watermarked" by the secure ID-Data entry program code in that "hidden" information may be incorporated into the scene (for example "salted-checksums") to allow careful analysis of the output of secure ID-Data entry program code 31 to distinguish between original graphics animation and counterfeit animation.
- the hidden information may be encoded in the least-significant bit of pixel data at selected locations of the animation.
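The least-significant-bit encoding can be sketched as follows; the particular locations and bit layout are illustrative assumptions.

```c
#include <stdint.h>
#include <stddef.h>

/* Embed one watermark bit into the least-significant bit of each selected
   pixel; the change is visually imperceptible. */
void embed_watermark(uint8_t *pixels, const size_t *locs,
                     const uint8_t *bits, size_t n) {
    for (size_t i = 0; i < n; i++)
        pixels[locs[i]] = (pixels[locs[i]] & 0xFEu) | (bits[i] & 1u);
}

/* Recover the watermark bits; a counterfeit animation will not carry them. */
void extract_watermark(const uint8_t *pixels, const size_t *locs,
                       uint8_t *bits, size_t n) {
    for (size_t i = 0; i < n; i++)
        bits[i] = pixels[locs[i]] & 1u;
}
```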
- the user determinable sequence of animation can also extend to the provided audio animation.
- audio and other feedback techniques, including music and speaking tones, can be played in response to particular keystroke combinations.
- the security of the application 30 can, once again, be substantially increased.
- the change in voice intonation will be readily "learnt" by a user and thereby further inhibit a rogue's ability to duplicate the same sequence of sounds or voices.
- the encoding of the voice system should be in an encrypted form.
- a notification message is preferably sent to a prosecuting body or the like where the application 30 is currently, or later becomes, connected to a network such as the Internet, or by other means (eg: via Modem or by including coded information in public or other files).
- a secure means of activation can be incorporated into the client application 30.
- the host and client intercommunication can issue challenge and response code authentication and verification utilising cryptographic systems such as public-key encryption and/or other standard means of overcoming data replay attacks and other threats designed to trick the secure client application 30 into activation.
- the standard executable 16 normally comprises a header section 71, a code section 72, and a data section 73.
- the header section 71 normally stores a standard set of information required by the computer operating system 17 (Fig.1) for running of the executable 16. This can include relocation data, code size etc.
- the code section 72 is normally provided for storing the "algorithmic" portion of the code.
- the data section 73 is normally utilised to store the data, such as constants, or overlays 92 utilised by the code section 72.
- Fig.6 shows the preferred embodiment of an applicator program 60, which takes as its input the executable program 16 and performs an obfuscating step 61, a ciphering step 62 and an anti-key press and authentication step 63 (described hereafter) which perform various transformations on the executable program 16 to produce a new executable program 30.
- the obfuscating step 61 modifies the header 71 (Fig.7) of the executable 16 in addition to inserting loading code which will be described hereinafter.
- the cipher step 62 encrypts the existing executable 16 and calculates check data (eg: a checksum) for the encrypted executable.
- the anti-key press and authentication step 63 replaces various insecure system calls with safe equivalent code and preferably inserts code to graphically represent the integrity of said executable program.
- the newly formed executable 30 (new.exe) can then be stored on disk and the applicator program 60 completed, the new executable 30 replacing the old executable program 16.
- the replaced executable 30 executes the obfuscating code, previously inserted by applicator 60.
- the obfuscating code initially decrypts the executable program and validates the stored check-data before re-executing the decrypted executable.
- Fig.9 shows the structure 90 of an executable ".EXE" program in MS-DOS as normally stored on disk.
- the structure 90 includes a header 71, otherwise known in MS-DOS terminology as the program segment prefix (PSP).
- This is normally followed by a relocation table 91 which contains a list of pointers to variables within a code area 72 which must be updated with an offset address when the program is loaded into a particular area of memory.
- the operation of the relocation table is well known to those skilled in the art of systems programming.
- the next portion of structure 90 is the code area 72 which contains the machine instructions for operation on the x86 microprocessor.
- This is followed by a program data area 73 which contains the data for code area 72.
- overlays 92, which contain code, can be utilised in a known manner.
- Fig.8 shows the structure of the EXE file header 71 in more detail.
- the table of Fig.8 is reproduced from page 750 of the abovementioned Tischer reference.
- the header 71 includes a number of fields including, for example, a pointer 81 to the start of the code 72 (Fig.7) and a pointer 82 to the relocation table 91 (Fig.9).
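For reference, the header fields just mentioned can be modelled as a C struct. The field names below follow common usage rather than the Tischer reference itself; the code-start pointer 81 corresponds to the header-size field (code begins at that many 16-byte paragraphs into the file) and pointer 82 to the relocation-table offset.

```c
#include <stdint.h>

/* MS-DOS "MZ" executable header (Figs. 8-9). All fields are 16-bit words,
   so the struct packs to 28 bytes with no padding. */
struct exe_header {
    uint16_t e_magic;    /* "MZ" signature, 0x5A4D                         */
    uint16_t e_cblp;     /* bytes used in the last 512-byte page           */
    uint16_t e_cp;       /* number of 512-byte pages in the file           */
    uint16_t e_crlc;     /* relocation table entry count                   */
    uint16_t e_cparhdr;  /* header size in paragraphs -> start of code     */
    uint16_t e_minalloc; /* minimum extra paragraphs needed                */
    uint16_t e_maxalloc; /* maximum extra paragraphs requested             */
    uint16_t e_ss;       /* initial (relative) SS                          */
    uint16_t e_sp;       /* initial SP                                     */
    uint16_t e_csum;     /* checksum                                       */
    uint16_t e_ip;       /* initial IP -- the entry address the applicator */
    uint16_t e_cs;       /* initial (relative) CS     patches (cf. 109)    */
    uint16_t e_lfarlc;   /* file offset of the relocation table            */
    uint16_t e_ovno;     /* overlay number                                 */
};
```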
- the applicator program 60 (Fig.6) proceeds by means of the following steps:
- the executable program 16 is opened for reading and a determination made of its size.
- the relocation table 91 is then read into the memory of the applicator program 60.
- the relocation table 91 consists of a series of pointers to positions within code segment 72 which are required to be updated when loading the program.exe file into memory for execution.
- the relocation table is sorted 93 by address before being written out to the new.exe executable file at position 102.
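The sort at step 93 might be sketched as follows, assuming entries are segment:offset pairs ordered by their 20-bit linear address; the specification says only that the table is sorted by address, so the comparison key is an assumption.

```c
#include <stdint.h>
#include <stdlib.h>
#include <stddef.h>

/* One MZ relocation entry: a segment:offset pointer into code area 72. */
struct reloc { uint16_t off, seg; };

/* Order entries by the 20-bit linear address seg*16 + off. */
static int reloc_cmp(const void *a, const void *b) {
    uint32_t la = ((const struct reloc *)a)->seg * 16u
                + ((const struct reloc *)a)->off;
    uint32_t lb = ((const struct reloc *)b)->seg * 16u
                + ((const struct reloc *)b)->off;
    return (la > lb) - (la < lb);
}

/* Sort table 91 before it is written out to new.exe at position 102. */
void sort_reloc_table(struct reloc *table, size_t n) {
    qsort(table, n, sizeof *table, reloc_cmp);
}
```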
- upon finding code portion 94, code portion 95, also denoted part A, is encrypted and copied across to form new code portion 103. Code portion 94 is then encrypted and copied to an area 105 of new.exe 30. The netsafe 1 code 104 is then inserted by applicator 60. Code portion 96, also denoted part C, is encrypted and copied across to form code portion 106. Data portion 73 and overlay portion 92 are copied into new.exe 30 as shown. A second portion of obfuscating code, denoted "netsafe 2" 107, the contents of which will be described hereinafter, is then inserted after overlays 92 and before code portion part B 105.
- header 101 is then updated to reflect the altered layout of new.exe executable 30. Additionally, the initial address 109 of execution stored in header 101 is altered to be the start of netsafe 1 portion 104.
- code portions 103, 106 and 105 are subjected to encryption or encipherment in accordance with step 62 of Fig.6.
- the encryption scheme utilised can be subjected to substantial variation.
- the DES standard encryption scheme was utilised. This scheme relies on a fifty-six bit key for encryption and decryption and is well known in the art.
- a number of different methods can be utilised to store the key.
- the preferred method is to spread portions of the key to different positions within the executable 30.
- bits of the key can be stored within the netsafe 1 code 104 and netsafe 2 code 107.
- bits of the key can be stored within header portion 101.
- bits of the key can be stored in the condition codes which are a consequence of execution of various instructions within netsafe 1 area 104 and netsafe 2 area 107 and/or the operating system 17 (Fig.5), with the overall requirement being that the key can be later extracted using a predetermined algorithm.
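A much-simplified sketch of the key-spreading idea: the slot positions below are invented for illustration, and whole bytes are scattered rather than individual bits; the embodiment may additionally derive key bits from condition codes, which is not modelled here.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical positions inside the executable image where successive key
   bytes are hidden (some in netsafe 1, some in netsafe 2, some in the
   header); only the predetermined extraction algorithm knows the pattern. */
static const size_t key_slots[7] = {0x11, 0x47, 0x02, 0x9C, 0x30, 0x65, 0x7B};

/* Scatter a 56-bit (7-byte) DES key across the image. */
void scatter_key(uint8_t *image, const uint8_t key[7]) {
    for (int i = 0; i < 7; i++) image[key_slots[i]] = key[i];
}

/* Reassemble the key at run time using the same predetermined pattern. */
void gather_key(const uint8_t *image, uint8_t key[7]) {
    for (int i = 0; i < 7; i++) key[i] = image[key_slots[i]];
}
```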
- the next step is to patch the address of the start of code area 72 and netsafe 2 code area 107 into the required locations within netsafe 1 area 104.
- the netsafe 1 area is then written to the file containing new.exe executable 30.
- netsafe 2 area 107 is responsible for loading code portion 105 over the top of netsafe 1 area 104. Therefore, it is necessary to write the relevant addresses of the start and end of code portion 94 to the required position within netsafe 2 area 107.
- netsafe 2 area 107 is also responsible for decrypting the encrypted portions of codes 103, 104, 105, 106, and 107 and hence the netsafe 2 area 107 must also store this combined code size for later use on decryption.
- netsafe code areas 104 and 107 contain code to decrypt the encrypted areas of the new.exe 30, to repatch code portion 105 back to its original position, and to replace potentially insecure routines or easily spoofed screens normally utilised by the application (eg: unsafe keyboard drivers) with an alternative safe form of routine.
- upon execution of the new.exe executable 30, the executable starts at the start of netsafe 1, area 104 (Fig.11), as this address has been previously patched into position 109 (Fig.10) of header 101 (Fig.10). The netsafe 1 area 104 then performs the following steps (A1) to (A10):
- the first step is to disable all the interrupts apart from those necessary for continued operation of the computer device 18 (Fig.1) (for example, memory refresh cannot be disabled).
- the disabling of interrupts includes the disabling of the keyboard interrupt in order to stop amateur "code snoopers" from determining the operation of the code area 104.
- interrupt trap addresses are then altered in a two stage process.
- the first stage resets a first part of the SEG:OFF address format and occurs at this point, with a second stage occurring at a later time as will be further described hereinbelow.
- any code snooper will be further confused as said trap addresses will initially be garbage.
- Fig.11 shows the standard format of the executable new.exe 30 when executing in memory.
- an executing program under the MS-DOS system will include a stack 111 and work space 112.
- a memory allocation (Malloc) call is then done to set aside an area 113 for the loading in of the netsafe 2 code 107 of Fig.10.
- the disk copy of new.exe 30 (having the format shown in Fig.10) is then opened by the netsafe 1 code 115 and an encrypted copy of netsafe 2 code 107 (Fig.10) is then loaded in from the disk file, decrypted and stored in memory area 113.
- the relocatable pointers of the code contained within the netsafe 2 code 113 are then updated to reflect the position of the executable in memory.
- the code area netsafe 2, 113 then performs the following steps (B1) to (B4):
- the netsafe 2 area 113 includes a number of keyboard routines which are preferably stored in an encrypted format. Therefore, the next step is to apply the decryption to any of the encrypted areas of netsafe 2 code area 113. After decryption, the netsafe 2 area 113 is checksummed and the result is tested against a prestored checksum to ensure the integrity of netsafe 2 area 113. (B3) The disk copy of the new.exe is then again read in and checked against prestored check data to ensure that it has not been changed. Additionally, an attempt is made to read past the end of file of the disk copy of new.exe 30 (Fig.10) to ensure that no extension (eg: viral) has occurred.
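The integrity checks just described can be sketched as follows. The rotating-XOR checksum is illustrative only (the embodiment may use any check data), and a length comparison stands in for the read-past-end-of-file test.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative rotate-and-XOR check-data over an on-disk image. */
uint32_t image_checksum(const uint8_t *buf, size_t len) {
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum = (sum << 1 | sum >> 31) ^ buf[i];
    return sum;
}

/* Verify the disk copy against prestored check data; a length mismatch
   indicates an appended (eg: viral) extension. */
int image_intact(const uint8_t *image, size_t actual_len,
                 size_t expected_len, uint32_t expected_sum) {
    if (actual_len != expected_len)
        return 0;                     /* file has grown: extension detected */
    return image_checksum(image, expected_len) == expected_sum;
}
```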
- keyboard routines 51 are installed, with the keyboard hardware being interrogated directly by keyboard routines 51 to return information to the calling program.
- keyboard routines 51 include a copy of the correct interrupt vector addresses for each keyboard routine and, each time they are called, a check is made of the interrupt table to ensure that it has not been altered.
- keyboard routines 51 protect the keyboard hardware by issuing controller reset or similar commands to flush the keyboard data out of the circuitry after said data is retrieved to prevent hardware eavesdropping, or routines 51 utilise the protected mechanisms of the central processor to protect said hardware from eavesdropping.
- interrupt 21h (an MS-DOS standard) is also revectored to a corresponding area of routines 51.
- the termination code of keyboard routine area 51 restores the correct interrupt pointers in interrupt table 131 to point to the MS-DOS operating system 17, and clears the no-longer-needed program and data from memory before returning to the DOS operating system by calling the real interrupt 21.
- Claims 1, 2, and 3 are independent.
- the invention in claim 1 covers any high security software protecting ID-Data by utilising anti-spy techniques, and tamper-protecting itself.
- Claim 2 is for a method of producing high security software, such as, but not limited to, that in claim 1.
- Claim 3 is for a new process of graphically representing the authenticity of high security software, such as, but not limited to, that in claim 1 or produced by claim 2.
- Claims 4, 5, 6, 7, 8, and 9 add preferred components to the high-security enforcing functions of the software in claim 1.
- Claim 10 adds a tracing-prevention preferred component to claim 9.
- protecting an interface against malicious (or otherwise) tampering, examination, emulation, and eavesdropping, i.e.: hampering the possibility that a fake copy of said interface can be successfully presented to a user to fool said user into entering information into the fake interface.
Abstract
A software-based computer security enhancing process and graphical software-authenticity method, and a method to apply aspects of the two are disclosed. The process provides protection against certain attacks on executable software by persons or other software used on the computer. Software using this process is protected against eavesdropping (the monitoring of software, applications, the operating system, disks, keyboard, or other devices to record (steal) identification, authentication or sensitive data such as passwords, User-ID's, credit-card number and expiry dates, bank account and PIN numbers, smart-card data, biometric information (for example: the data comprising a retina or fingerprint scan), or encryption keys), local and remote tampering (altering software to remove, disable, or compromise security features of the altered software) examination (viewing the executable program, usually with the intent of devising security attacks upon it), tracing (observing the operating of an executable program step-by-step), and spoofing (substituting counterfeit software to emulate the interface of authentic software in order to subvert security) by rogues (e.g.: Trojan Horses, Hackers, Viruses, Terminate-and-stay-resident programs, co-resident software, multi-threaded operating system processes, Worms, Spoof programs, key-press password captures, macro recorders, sniffers, and other software or subversions). Aspects include executable encryption, obfuscation, anti-tracing, anti-tamper and self-verification, runtime self-monitoring, and audiovisual authentication (math, encryption, and graphics based method permitting users to immediately recognise the authenticity and integrity of software). The figure in the specification depicts the many components and their interaction.
Description
COMPUTER SOFTWARE AUTHENTICATION, PROTECTION, AND SECURITY SYSTEM
BACKGROUND OF THE INVENTION
The present invention relates to a computer program having enhanced security features, and also to a system and method for enhancing the security features of a computer program. In particular, the present invention relates to such a program, and the system and method for creating the program, having increased security features to prevent ID-Data (as defined hereafter) eavesdropping and/or theft and/or to ensure authenticity.
DESCRIPTION OF THE PRIOR ART
Computers are becoming widely interconnected and heavily relied upon to process and store sensitive information. The risk of unauthorised access to computers and information has increased with this increased interconnectivity.
Many security advances exist in the areas of identification & authentication of users, cryptography, virus prevention, and the like; however, almost all of these advances ultimately rely upon computer software. Most computer systems are, or are accessed by, small personal computers, and most software used on these personal computers is susceptible to "local attacks" - attacks which are mounted from inside said personal computers against said software by other software or people.
Passwords, User-ID's, credit-card numbers and expiry dates, bank account and PIN numbers, smart-card data, biometric information (for example the data comprising a retina or fingerprint scan), cryptographic keys, and the like are all examples of identification, authentication or similar data which is either sensitive in itself, or may allow access to sensitive, restricted or other information or services. Hereafter, the term ID-Data will be used to refer to the abovementioned identification, authentication or similar data, excluding ID-Data which is valid only for a single use, or which is designed to expire at regular intervals of less than two minutes.
Illegal access to computer system information can be obtained by exploiting various security flaws found in computer software products. A common flaw is the susceptibility of said software to the theft of ID-Data either directly from said software as it executes, or from the operating system or hardware on which said software is executing. Another common flaw is the susceptibility of said software to illegal modification. Such modifications may remove, disable, or compromise the security features of said software.
Viruses, Terminate-and-stay-resident programs (TSRs), co-resident software, multithreaded operating system processes, Trojan Horses, Worms, Hackers, Spoof programs, key-press password capturers, macro-recorders, sniffers, and the like can be effective at stealing ID-Data and are examples of (a) rogue software or (b) people capable of subverting security software or (c) software
which can be configured for illegitimate purposes. Hereafter, the term rogue software will be used to refer to software or subversions such as the abovementioned (a), (b) and (c), used for the purpose of stealing ID-Data. The definition of our term "rogue software" when used herein also includes software or other means used to tamper with other software. The term tampering is defined hereafter. There are many ways to introduce rogue software into a computer system. Viruses spread automatically by introducing themselves. Trojan-Horses are usually introduced by tricking users into allowing them to execute (such as by masquerading as a new or well-known computer game or other product). Existing security problems may be utilised to introduce rogue software; some well known problems include Java bugs, errors, or oversights, ineffective physical security (for example: permitting rogue software to be introduced directly on floppy disk by an intruder), electronic mail attachments which automatically execute or execute after a simple mouse-click, incorrect security settings on internet, world-wide-web, TCP/IP or modems, and tampering (see definition hereafter) with legitimate software in-transit as it flows from remote internet sites into a user's computer, to name a few.
Rogue software, once introduced, can steal ID-Data as mentioned hereinbefore. It may monitor keyboard (for example: by recording every key, as the user presses each one, in order to steal a password as it is being typed in), serial-port, mouse, screen, or other devices to steal ID-Data directly from them. It may monitor other software, applications, the operating system, or disks to steal ID-Data from there also. Once stolen, this ID-Data may be stored locally (for example: in memory or on-disk) or transmitted to remote locations (for example: by modem or network) or used immediately to perform illegal operations. Hereafter, the term eavesdropping will be used to refer to the monitoring of a computer to record ID-Data.
For example, a key press recorder could secretly, and unbeknown to the computer user, record all the keys pressed by the user into a hidden systems file. The information recorded could include a user's password and other sensitive information which an organisation would obviously wish to protect.
Additionally, rogue software may remove, disable, or compromise existing computer software security features by modifying the memory, disk, or other image of said computer software. Rogue software may also utilise tampering techniques to alter existing computer software in order to steal ID-Data from it, or may attach itself to existing computer software (as is the case with many computer viruses). Hereafter, the term tampering will be used to refer to the abovementioned modification of computer software. Tampering may take place either locally (within a user's PC) or remotely (for example: at one of the points which a computer program passes through as it is being downloaded).
Further, counterfeit software can be substituted for legitimate software. The counterfeit will appear real to a computer user, but actually acts to subvert security, such as by stealing ID-Data. Sometimes called "Spoof" programs or Trojan Horses, counterfeit software of this type may invoke the original legitimate software after having stolen ID-Data, so as not to arouse a user's suspicion.
Another potential security flaw found in computer software products is susceptibility to examination and reverse-engineering. Known (but generally secret) and other security problems or
mistakes can be discovered by hackers and the like from the examination of existing computer software and by tracing its operation.
Additionally, computer software piracy is a growing problem, and the existing simple means which prevent this problem (such as registration or serial numbers and customer-names being encoded within the product) are becoming less effective.
There is necessity within the try-before-you-buy software market for vendors to employ effective features which allow old software to expire without fear of hackers or the like removing said expiry features, and for secure registration of software to be provided through the use of software unlock-codes.
There is also need for software to be able to prevent security attacks upon itself (ie: tampering) and upon its own attack-detection code. There may also be a future need for software to identify the attacker for subsequent prosecution.
There also exist cases where untamperable software usage metering may be desirable, and where effective password-protection of software execution may also be desirable.
Known advances in certain areas of computer security have been successful and documented.
There have been some advances in anti-virus technology which help detect and prevent certain security problems. There have been numerous advances in hardware-assisted computer security add-ons and devices, such as smartcards and biometric input devices. There have been advances in cryptographic techniques. Generally, all of these advances require authentic, un-tampered-with computer software in order to work. There have been relatively few advances in software-based integrity self-checking (eg: tamper protection), and no prior software-based advances in preventing eavesdropping or the electronic theft of ID-Data, and no prior software-based advances in self-authentication.
SUMMARY OF THE INVENTION
This invention describes a process which substantially enhances the security of computer software (hereafter referred to as the improved process) and a method by which to apply said improved process (hereafter referred to as the applicator).
The improved process consists of including computer code to automatically detect tampering of said computer software, and computer code to prevent the theft of ID-Data by replacing existing vulnerable (to rogue software eavesdropping or attack) software or operating system code with secure equivalents which utilise anti-spy techniques (as described later in this document).
Preferably, the improved process also consists of including computer code to prevent de-compilation, reverse-engineering, and disassembly by the inclusion of obfuscating code inserts, and the use of executable encryption.
Preferably, the improved process also consists of including code to prevent execution-tracing
and debugging by the use of code designed to detect and prevent these operations.
Preferably, the improved process consists of, or also includes, human-recognisable audio-visual components which permit the authenticity of said computer software to be easily verified by the user on each invocation using techniques described later in this document.
The idea which led to the creation of this invention can be summarised as follows: if a piece of computer software that is executing can be shown to be the genuine article, and this software can protect itself against eavesdropping, and this software can prevent tampering of itself, then it is possible for this software to function in a secure manner, even within an insecure operating system. This invention permits the creation of such a piece of computer software - having a tangible, useful security advantage and hence improving its value.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig.1 illustrates the standard operation of a computer system known in the prior art;
Fig.2 illustrates the known operation of a rogue or "spoof" program; Fig.3 illustrates application code updated with the preferred embodiment;
Fig.4 illustrates the known operation of a rogue eavesdropping program;
Fig.5 illustrates the interaction of the components of the updated application;
Fig.6 illustrates the general structure of the preferred embodiment of the applicator;
Fig.7 illustrates a standard layout for a program to be executed on a computer system; Fig.8 illustrates the standard layout of an EXE header under the MS-DOS operating system;
Fig.9 illustrates a standard layout of an EXE program under MS-DOS;
Fig.10 illustrates an altered executable form constructed in accordance with the specific embodiment;
Fig.11 illustrates a first stage of execution of the new.exe executable;
Fig.12 illustrates a second stage of execution of the new.exe executable file; Fig.13 illustrates a third stage of execution of the new.exe executable file.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
As will be described hereinafter, the present invention has general applicability to many different operating systems including Microsoft DOS (Trade Mark), Apple Macintosh Operating System, Unix (Trade Mark) etc.
Described hereafter are several security-enhancing techniques to combat eavesdropping. Security is provided by (a) hampering examination of software-code or operating system code or parts thereof through the use of the encryption or partial encryption of said code, (b) preventing the disassembly of said code through the inclusion of dummy instructions and prefixes and additional code to mislead and hamper disassembly (ie: obfuscating inserts), (c) preventing the computerised tracing of the execution of said code (for example: with code debugging tools) through the use of instructions to detect, mislead, and hamper tracing, (d) preventing tampering of said code through the use of scanning to locate alterations, either or both on-disk and in memory, either once at the start of execution, or continuously upon certain events, or (e) preventing ID-Data theft through the inclusion of secure input/output routines (for example: routines to bypass the standard operating system keyboard calls and use custom-written higher-security routines as a replacement) to replace insecure computer-system routines. Hereafter, the term anti-spy will be used to refer to any combination of one or more of the abovementioned techniques [(a) through (e) or parts thereof] used to prevent eavesdropping.
Referring now to Fig.1 there is illustrated the standard scenario for "running" a given executable program 16, under the control of a computer operating system 17 on a computer 18. In the preferred embodiment of the present invention, the executable program 16 is subjected to modification, as will be described hereinafter, to ensure its integrity and improve its security.
There are five aspects of this invention's improved process, although said process is still substantially improved even if not all of them are present. These aspects are: (1) preventing eavesdropping, (2) preventing disassembly and examination, (3) detecting tampering, (4) preventing execution-tracing, and (5) ensuring authenticity.
The preferred embodiment of these aspects of the present invention will now be described.
Aspect 1. Preventing eavesdropping.
As hereinbefore described, it is desirable to prevent rogue software from eavesdropping on ID-Data. By replacing software which is vulnerable to eavesdropping with equivalent software which is far more secure, this purpose is achieved. To remove the vulnerability from said equivalent software, replacement routines may communicate directly with the hardware of the computer (for example, they may communicate with the keyboard circuitry instead of using the system-supplied (and hence possibly insecure) application interface keyboard-entry function-calls) while disabling system interrupts which would permit rogue software to eavesdrop. Said replacement routines are coded to store retrieved ID-Data in a secure manner. ID-Data is not stored in full in plaintext (ie: unencrypted) in system or application buffers.
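The idea of never holding the full plaintext ID-Data in a buffer can be sketched as follows. The keystream generator, struct layout, and names are illustrative assumptions, not the cipher of the embodiment; each character is enciphered the moment it arrives, so no application buffer ever contains the complete plaintext.

```c
#include <stdint.h>
#include <stddef.h>

/* A secure-entry buffer: characters are XOR-enciphered on arrival with an
   illustrative linear-congruential keystream. */
struct secure_buf { uint8_t data[64]; size_t len; uint32_t ks; };

static uint8_t next_ks(struct secure_buf *b) {
    b->ks = b->ks * 1103515245u + 12345u;
    return (uint8_t)(b->ks >> 16);
}

/* Called for each character as it is retrieved from the keyboard hardware
   (with interrupts disabled); only the enciphered form is stored. */
void secure_buf_push(struct secure_buf *b, uint8_t ch) {
    if (b->len < sizeof b->data)
        b->data[b->len++] = ch ^ next_ks(b);
}
```

In the embodiment this would run inside the replacement keyboard routines, between reading the hardware and re-enabling interrupts.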
Aspect 2. Preventing disassembly and examination. As hereinbefore described, it is desirable to hamper disassembly (or de-compilation or reverse engineering) to protect software against eavesdropping and tampering, and to hinder examination of said software which might lead to secret security problems or mistakes being disclosed.
Obfuscating inserts can successfully prevent automatic disassembly. Obfuscation is achieved by following unconditional jump instructions (for example, the Intel JMP or CLC/JNC combination, or a CALL without a return expected, or any flow-of-control altering instruction which is known not to return to the usual place) with one or more dummy op-code bytes which will cause subsequent op-codes to be erroneously disassembled (for example, the Intel 0xEA op-code will cause disassembly of the subsequent 4 bytes to be incorrect, displaying them as the operand of the JMP instruction indicated by the 0xEA op-code instead of the instructions they actually represent).
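The junk-byte technique can be illustrated with a small C simulation. The code bytes and the toy linear-sweep length decoder below are illustrative assumptions, covering only the handful of 16-bit Intel opcodes used in this one example:

```c
#include <assert.h>

/* Byte layout after an obfuscating insert: the CPU executes the JMP
 * short and lands on the real MOV, but a naive linear-sweep
 * disassembler decodes sequentially, treats the junk 0xEA byte as a
 * far-JMP, and swallows the 4 bytes that follow it -- including the
 * real instructions, which therefore never appear in the listing. */
static const unsigned char code[] = {
    0xEB, 0x01,        /* JMP short +1 (CPU skips the junk byte)   */
    0xEA,              /* junk byte: far-JMP opcode, no operand    */
    0xB8, 0x34, 0x12,  /* real code: MOV AX, 0x1234                */
    0xC3,              /* real code: RET                           */
};

/* Instruction length at offset p, 16-bit mode, for these opcodes. */
static int insn_len(int p) {
    switch (code[p]) {
    case 0xEB: return 2;   /* JMP rel8                        */
    case 0xEA: return 5;   /* JMP ptr16:16, 4 operand bytes   */
    case 0xB8: return 3;   /* MOV AX, imm16                   */
    case 0xC3: return 1;   /* RET                             */
    default:   return 1;
    }
}

/* A linear sweep continues at offset 2 after the JMP, decodes the
 * junk byte there, and resumes at 2 + 5 = 7: past the real code. */
static int linear_sweep_after_junk(void) { return 2 + insn_len(2); }

/* The CPU instead follows JMP rel8: 2 + 1 = offset 3, the real MOV. */
static int cpu_target(void) { return 2 + 1; }
```

The mismatch between the two offsets is exactly the desynchronisation described above.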
Dummy instructions may also be included to hamper disassembly by deliberately misleading a disassembler into believing a particular flow of control will occur, when in fact it will not.
Flow of control can be designed to occur based upon CPU flag values determined by instructions executed much earlier. Together with tracing prevention, this makes manual disassembly nearly impossible.
The majority of the executable portions of the software can be encrypted for external storage, the decryption taking place in memory after the software is loaded from external sources, under the control of a decryption "header" which prevents its own tampering and disassembly. This makes manual and automatic disassembly nearly impossible, since the decryption should be designed to fail if tampering or tracing is detected.
Aspect 3. Detecting tampering.
As hereinbefore described, it is desirable to detect tampering, since this may lead to the reduction of software security.
This can be achieved with the use of code which is protected from disassembly and examination through obfuscation and encryption, which re-reads its own external image and compares it with its known memory image or precalculated check-data to detect hot-patching (ie: the modification of software sometime after it has been loaded from disk, but (usually) before execution of the modified section has commenced).
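A minimal sketch of such a self-check follows, in C. For simplicity the external image is represented as an in-memory byte array rather than being re-read from disk, and the check-data is a simple rolling checksum; both are illustrative assumptions, since a real implementation would re-read its own executable file and use a stronger check function:

```c
#include <assert.h>
#include <stddef.h>

/* Compute check-data over an image. In real use the image would be
 * the program's own on-disk bytes, re-read at runtime. */
static unsigned checksum(const unsigned char *img, size_t n) {
    unsigned sum = 0;
    for (size_t i = 0; i < n; i++)
        sum = sum * 31 + img[i];      /* simple rolling checksum */
    return sum;
}

/* Returns nonzero if the external image no longer matches the
 * precalculated check-data, i.e. a hot-patch is detected. */
static int tampered(const unsigned char *external, size_t n,
                    unsigned precalculated) {
    return checksum(external, n) != precalculated;
}
```

The precalculated value would be computed once by the protection tool and embedded in the protected executable.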
Additionally, the software can scan the memory image of itself one or more times, or continuously, to ensure that unexpected alterations do not occur.
Certain modifications to the external copy of software are reflected in subtle changes to the environment in which the modified software will be executed (for example: the size of the code, if altered, will be reflected in an incorrect initial code-size value supplied to the executing program). Additionally, certain modifications to the operating system and environment of said software can also be monitored (for example: certain interrupt vector table pointers in Intel-processor applications) to detect unexpected changes by rogue software. These changes can also be detected to prevent tampering.
Once tampering is detected, program flow-of-control needs to be changed so that the potential compromise associated with ID-Data theft is avoided. This may take the form of the security-enhanced program terminating, with a message indicating that its integrity has been compromised, before all of the ID-Data is entered. Alternatively, the fact that tampering has been detected may be kept secret and the ID-Data retrieved; however, immediately upon retrieval, the ID-Data entered can be invalidated, thus preventing access to that which the now potentially compromised ID-Data would have otherwise allowed. This latter method allows for the possibility of security-enhanced software informing remote or other authorities that tampering was detected, and possibly other information, such as what specifically was altered and by whom. Care must be taken to ensure the integrity of the "remote-informing" code before ID-Data entry is permitted.
It will be apparent to one skilled in the art of low-level software programming that the five aspects described herein may be combined to provide substantially stronger security than any aspect taken on its own. For instance, to combine tamper-detection with encryption, the precalculated check-data as derived during tamper-detection described hereinbefore may actually be one part of the decryption-key which is required to successfully decrypt the remaining executable software. If prevention-of-tracing and environment characteristics (including debugger detection as described hereafter) are additional portions of said decryption-key, the determination of said decryption-key by any person or computer program other than the secure original becomes an extremely difficult, if not impossible, task.
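The combination can be sketched as follows, with a single-byte XOR stream standing in for the real cipher and a rolling byte checksum standing in for the check-data (both illustrative assumptions). The structural point is that the key is derived from the tamper-check value, so there is no patchable branch: a tampered image simply decrypts to garbage.

```c
#include <assert.h>
#include <stddef.h>

/* Tamper check over the clear loader image. */
static unsigned char checkdata(const unsigned char *img, size_t n) {
    unsigned char c = 0x5A;
    for (size_t i = 0; i < n; i++)
        c = (unsigned char)(c * 31 + img[i]);
    return c;
}

/* Decrypt 'body' in place. The key is the loader's check-data XORed
 * with a stored key part -- there is no "if tampered then exit"
 * branch for a rogue to patch out; a patched loader just yields a
 * wrong key and hence a garbage decryption. */
static void decrypt(unsigned char *body, size_t n,
                    const unsigned char *loader, size_t ln,
                    unsigned char stored_key_part) {
    unsigned char key =
        (unsigned char)(checkdata(loader, ln) ^ stored_key_part);
    for (size_t i = 0; i < n; i++)
        body[i] ^= key;
}
```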
Further, it will also be apparent to one skilled in the art of low-level software programming that a simple construct such as a JNE to alter program flow-of-control after tampering has been detected is insufficient, since the JNE construct itself is subject to tampering. The decryption process described hereinbefore is preferable since there is no single point of alteration that can possibly yield a tampered executable that would execute. Indeed, the executable protected with encryption will not even be transformed into its intended form if tampering is detected.
Aspect 4. Preventing execution-tracing.
Apart from "spoofing" (described in aspect 5 hereafter), the last resort of a rogue who is prevented from disassembling, tampering with, and eavesdropping on software is to trace the execution of said software in order to facilitate the compromise of its security. Hampering tracing (tracing is sometimes called debugging) prevents this.
There are numerous methods of detecting a debug environment (ie: when tracing is taking place). When combined with decryption and tamper-protection as hereinbefore described, these make the rogue's task of detecting and bypassing debug-detection extremely difficult. References and examples for Intel and MS-DOS environments follow hereafter, although it will be apparent to one skilled in the art that these and similar methods are applicable on other platforms.
Standard Intel x86 interrupts 1 and 3 are used by debuggers to facilitate code tracing. Utilising these interrupts (which are not normally used by normal applications) within security-enhanced software hampers debugging, since built-in debugging functions are then no longer automatically available.
Monitoring the system timer to determine whether software execution has spent too long accomplishing certain tasks can detect a situation where code tracing has been in effect and a breakpoint was reached.
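A sketch of such a timer check in portable C. The two-second budget and the use of the standard clock() function are illustrative assumptions; an MS-DOS implementation would instead read the BIOS tick counter, and real code would calibrate the budget per task:

```c
#include <assert.h>
#include <time.h>

/* If the time spent in a short stretch of code exceeds a generous
 * budget, a breakpoint that was hit (or a human single-stepping the
 * code) is the likely cause. clock() measures processor time here;
 * a real implementation would prefer a wall-clock source. */
static int traced(clock_t start, clock_t end, double budget_seconds) {
    double elapsed = (double)(end - start) / CLOCKS_PER_SEC;
    return elapsed > budget_seconds;
}
```

Typical use brackets a timing-sensitive sequence: record clock() before and after it, and take evasive action (or corrupt a derived key) when traced() reports an overrun.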
Disabling the keyboard will hamper debuggers, since tracing instructions are usually issued from the keyboard. Similarly, disabling other places from where tracing instructions are usually issued (eg: serial ports, printer ports, and mouse) or displayed (eg: screen) will also hamper tracing.
System interrupts can be re-vectored for use within the secure software to perform tasks not usually performed by those interrupts. Debuggers usually rely upon system interrupts too, so doing this would usually disable or destroy a debugger being used to trace the software.
Disabling interrupts and performing timing-sensitive instructions between them will further hamper debugging. When tracing software, instructions are usually executed one at a time in order for the user to understand their operation. Many system interrupts must occur regularly (eg: timer and memory-refresh operations), so debuggers usually do not disable interrupts even when they encounter an interrupt-disabling instruction. If timers and the like are re-vectored in two separate stages, any timer (etc) interrupt occurring in between the two stages will fail, and usually crash the computer.
Further, interrupts can be disabled or enabled using obscure means (with flag-altering instructions, for example) to hamper tracing.
Discreetly testing the status of disabled or enabled system facilities (eg: interrupts, keyboard, vector-pointers) to ensure that a debug environment has not altered or bypassed them will seriously hamper tracing also.
Certain computer processors have instruction caches. In some circumstances, it is possible to alter the instructions immediately before the CPU encounters them, but the altered instruction will not be executed normally because the cache still holds the "old" copy. In debug environments, the cache is usually flushed, so any altered instructions will actually be executed. This again hampers tracing.
Using strong cryptographic schemes, such as DES or RSA or the like, will prevent the examination of any decryption routines from revealing a simple patch to disable said routines.
When tracing software, the program stack is usually used by the debugger, either during the tracing operations or at other times. This is easily detected, and by using the area of the stack which will be destroyed by unexpected stack-use for code or critical data, software can be designed to self-destruct in this situation.
Scanning the command environment and the execution instruction can detect the execution of software by unusual means. Searching for "DEBUG" in the command line, or scanning memory for known debuggers, for example, will detect tracing. Additionally, by detecting which operating system process initiated the load of the software, unexpected processes (eg: debuggers) can be detected.
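A sketch of such a command-environment scan. The list of debugger names is illustrative only, and a real implementation would also inspect environment strings and the loading process:

```c
#include <assert.h>
#include <ctype.h>
#include <string.h>

/* Case-insensitive substring search over a command line or an
 * environment string. */
static int contains_nocase(const char *haystack, const char *needle) {
    size_t nl = strlen(needle);
    for (const char *p = haystack; *p; p++) {
        size_t i = 0;
        while (i < nl && p[i] &&
               tolower((unsigned char)p[i]) ==
               tolower((unsigned char)needle[i]))
            i++;
        if (i == nl) return 1;
    }
    return 0;
}

/* Flag a launch whose command line mentions a known debugger. */
static int launched_by_debugger(const char *cmdline) {
    static const char *known[] = { "DEBUG", "CODEVIEW" };
    for (size_t i = 0; i < sizeof known / sizeof known[0]; i++)
        if (contains_nocase(cmdline, known[i]))
            return 1;
    return 0;
}
```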
Monitoring system buffers (eg: the keyboard memory buffer) or hardware (eg: the keyboard circuitry and internal buffers) for unexpected use (eg: keyboard input and processing is occurring when the software is not requesting it) will also detect debuggers, which usually rely in part on system functions in order to operate.
Building a process or multiple processes which are traditionally difficult to trace, such as a resident or child process which executes during system interrupts or after the parent process has terminated, will again hamper tracing.
Bypassing system routines (eg: in DOS, using direct memory writes instead of DOS system calls to revector interrupts) will further hamper debugging and rogue software monitoring, as will unravelling loop constructs (which will make tracing long and cumbersome).
Code checksums and operating-system checks (eg: interrupt table pointers) can be designed to detect debug-breakpoint instruction inserts or other modifications. Using the result of the checksum for some obscure purpose (eg: decryption, or (much later) control-flow changes) will further hamper tracing.
It will be apparent to one skilled in the art of low-level software programming that a combination of techniques to detect, prevent, and mislead tracing will provide a mechanism making tracing very difficult, if not impossible. At the very least, it will require an expert with very expensive tools, and perhaps some understanding of the original software design, a very long time to make any debugging progress - a situation which is recognised in military software security accreditation worldwide as highly desirable.
Aspect 5. Ensuring authenticity.
In accordance with an aspect of the present invention there is provided a method of providing for a secure entry of ID-Data in a computer system comprising activating a visual display or animation and/or audio feedback (hereinafter called an audio/visual component) as part of said secure entry of ID-Data so as to hamper emulation of said secure entry process.
Preferably, the animation includes feedback portions as part of the ID-Data entry process.
Preferably, the animation is repeatable and varied in accordance with the information entered.
The animation preferably comprises 2.5D or 3D animation and includes animation of any ID-Data input.
Preferably, the animation is designed to tax the computer resources utilised, thereby making any forgery thereof more difficult.
Notwithstanding any other forms which may fall within the scope of the present invention, preferred forms of the invention will now be described, by way of example only, with reference to the accompanying drawings.
In the preferred embodiment of the present invention the user interface for the acquiring of ID-Data is secured whereby the duplication of the interface is rendered mathematically complex such that cipher-code breaking techniques are required to produce a counterfeit look-alike interface. By making the authentication interface (ie: the ID-Data entry screen - for example a logon screen or a screen for entering credit card details) unable to be emulated, tampered with, or reverse engineered, the application program allows for a higher degree of security and authenticity even in insecure environments such as the Internet or home software applications.
Referring now to Fig.2, there is illustrated a classic form of rogue attack on a computer system. In this form of rogue attack, a rogue's "spoof" program 22 is inserted between application software 16 and the user 23. The application 16 normally has a portion 24 devoted to ID-Data entry and verification or the entry of commercially sensitive information (including passwords etc) to the application, in addition to the application code 25. The spoof program 22 is designed to exactly reflect the presented user interface of ID-Data entry code 24 to the user. The user 23 is then fooled into utilising the masquerading spoof program 22 as if it was the application 16. Hence the user can be tricked into divulging secret information to the spoof program 22. An example may include a classic "login spoof" wherein the spoof program 22 prints the login prompt (ie: ID-Data entry) message on the screen and the user mistakes the login prompt for a legitimate one, supplying a user name and password to this program 22, which records this information as well as passing it on to the login code 24 of application 16 so as not to arouse the suspicion of user 23 - or by issuing a message, such as "incorrect password, please try again", and then passing control to the login code 24 of application 16.
Referring now to Fig.4, there is illustrated a relatively new form of rogue attack 40. This form of attack proceeds similarly to the spoof attack of Fig.2, with the following difference. Instead of a spoof program 22, a rogue program 41 is inserted which secretly eavesdrops on ID-Data entry code 24, or on application code 25, or on operating system 17, or on hardware 18, or elsewhere, in order to steal sensitive information directly from the legitimate application. Since the legitimate application is still actually executing, the user's suspicion is not aroused, as rogue program 41 is generally invisible to the user 23. Alternatively, executable program 16 may have been tampered with (as hereinbefore described) to reduce its security, alleviating the necessity for the presence of rogue program 41.
In Fig.5, there is illustrated in detail the structure of an application 50 constructed in accordance with the preferred embodiment running on computer hardware 18. Fig.5 is similar to Fig.4 with the important difference that user 23 now communicates directly with secure drivers 51 which are part of the secure ID-Data entry program code 31 which is utilised by the security-enhanced (eg: tamper-protected) application code 52. It can be seen that the user 23 no longer communicates with the operating system 17 or the unprotected computer hardware 18; thus the rogue program 41 can no longer eavesdrop on ID-Data.
In Fig.3, there is illustrated, in more general terms than Fig.5, the structure of an application 30 constructed in accordance with the preferred embodiment wherein secure ID-Data entry program code 31 is provided which is extremely difficult to replicate, eavesdrop upon, or subvert. The secured ID-Data entry program code 31 can be created utilising a number of different techniques.
Firstly, the executable portion of the secured ID-Data entry code can be protected against tracing, disassembly, tampering, viewing, reverse engineering, keyboard entry theft, eavesdropping, hot-patching and other attacks by transforming the secured ID-Data entry program code 31 from its normal executable form 16 (Fig.2) to a corresponding secured form of executable (as hereinbefore described - refer to aspects 1 to 4). These techniques are preferably applied to the application code 16 in general or, less preferably, specifically limited to the ID-Data entry portions 24 thereof.
Additionally, the secure ID-Data entry program code 31 is itself created. This code 31 preferably comprises a complex graphical user interface series of screens and animation designed to make duplication thereof by a rogue extremely difficult.
Initially, the complex user interface should include facilities to disable any frame buffer recording devices, the disablement occurring before each frame is displayed. Also, where a multi-tasking operating system is in use, or where context switching is enabled, switching out of the interface screen is preferably disabled, or ID-Data entry procedures encrypted or terminated when the interface screen is swapped out. The images presented which form part of the ID-Data entry screens comprise complex 3D animation sequences having a high degree of complexity and extensive use of screen colours and screen resolution, in addition to visual design, so as to make copying thereof extremely difficult.
The complex computer graphics can be created utilising standard techniques. For information on how to create complex 3D imagery, reference is made to "Computer Graphics, Principles and Practice" by Foley, Van Dam et al, published 1990 by Addison-Wesley Publishing Company, or other standard textbooks on generation of computer graphics. Reference is also made to the numerous internet news groups and archives on graphics and games programming, specifically to: comp.graphics.research, comp.graphics.rendering, comp.graphics.raytracing, comp.graphics.misc, comp.graphics.digest, comp.graphics.animation, comp.graphics.algorithms, comp.graphics, alt.graphics.pixutils, alt.graphics, rec.games.programmer, comp.sys.programmer, comp.sys.ibm.programmer, comp.sys.ibm.pc.programmer, comp.os.msdos.programmer, comp.msdos.programmer, alt.msdos.programmer. Reference is also made to the "PC Games Programmers Frequently Asked Questions" document available on the internet, via rec.games.programmer and elsewhere.
By encoding a complex 3D image which forms part of the ID-Data entry screens, the hurdle requirement for a rogue to reverse engineer the complex imagery is substantially increased. The inclusion of graphical animation is advantageous in preventing static screen-shot duplication attacks by a rogue from succeeding.
As noted above, it is preferable that traditionally difficult graphical programming techniques are employed wherever possible, with the aim of making it easier for a user interacting with the system to discern lesser copies of the animation. Suitable 3D animation can include the introduction of shadows, the lighting of pseudo-3D animated objects, transparent or translucent objects, shiny, reflective, or mirrored objects, gravitational effects in animated objects, single-image-random-dot-stereogram bitmaps or backdrops, translucent threads, effects such as diffraction patterns, screen masks, backdrops, colour palette "animation", and complex animated objects resistant to simple hidden-surface removal techniques known to those skilled in the art and directed to hindering duplication.
Further, the animation can take into account:
1. Thwarting attempts at compression of the ID-Data entry screens. This can be achieved by having animation which has high visual entropy and many graphical elements which are altered from frame to frame in a manner which is highly discernible to the human viewer. Apart from being difficult to replicate, complex 3D computer imagery having high entropy (low redundancy) will require large amounts of storage space for a rogue attempt at duplication based on recording the screen output, and this form of attack will therefore be more readily discernible to the user should it be mounted.
2. The animation is further preferably designed to thwart a successful replay attack which is based on providing only a subset (a limited number of frames) of the screen animation to a viewer. This can be achieved, for example, by the inclusion of several animated spheres which "bounce" around the screen and change colours in a manner that is recognisable to the viewing user but which is not readily repeatable. A replay of only a subset of the screen animations to the viewer will be highly evident in this case when, upon looping, the user is alerted to a problem when the animation "skips" or "jumps" and does not operate in a previously smooth manner. This makes it difficult for a rogue spoof program to copy the animation without including all parts of it.
3. Most importantly, the graphics presented can be customised to the input data entered. For example, the information entered by a user can be rendered and/or animated by the secure ID-Data entry program code 31 (Fig.3). As an example, in an ID-Data entry program, when a user types in their user name, the animation can be created letter by letter. For example, when typing in the user name "CHRIS", each letter could be rendered differently depending on those characters previously typed. For example, the letter "I" might appear as a large "barber's pole" which spirals and changes colour, speed, size, and/or position and is slightly transparent, thereby allowing the animated scene which is a backdrop to the character to be discerned through the character itself. In the above example, the letter "I" would only appear as the specific animated barber's pole that it does if the previous letters entered were "C", "H", and "R" respectively.
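One way such input-chained animation can be sketched is with a running hash over every character typed so far, from which the animation parameters of the current glyph are drawn. The FNV-style hash step and the particular parameter fields are illustrative assumptions, not taken from the text:

```c
#include <assert.h>
#include <stdint.h>

/* Parameters for the animated glyph shown for one typed character. */
struct glyph_anim {
    uint8_t colour;   /* palette index  */
    uint8_t speed;    /* rotation speed */
    uint8_t size;     /* on-screen size */
};

/* Running state: a hash over every character typed so far, so the
 * animation chosen for "I" after "CHR" differs from "I" after any
 * other prefix, yet retyping the same name reproduces it exactly. */
static uint32_t chain_update(uint32_t state, char c) {
    return state * 16777619u ^ (uint8_t)c;   /* FNV-1a style step */
}

static struct glyph_anim anim_for(uint32_t state) {
    struct glyph_anim a;
    a.colour = (uint8_t)(state >> 16);
    a.speed  = (uint8_t)(state >> 8);
    a.size   = (uint8_t)state;
    return a;
}
```

The renderer would feed each keystroke through chain_update() and draw the glyph using anim_for() on the new state.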
The utilisation of a unique sequence of animation based on a user's input of information-sensitive data increases the difficulty of creating any "spoof program" attack on the application 30. This is especially the case since the executable code of application 30 is preferably in an encrypted form. The use of animation particular to the order in which characters are entered is particularly advantageous as the computational complexity of replication is substantially increased.
A similarly effective animation technique is to produce only one graphical object after entry of each portion of ID-Data, such as a computer-generated human's face, but have the features of said face be determined by a hash or cryptographic function based upon the user's input. For example, after entry of the ID-Data "CHRIS" (in this example, the individual characters may not, themselves, be based on the abovementioned generation procedure), a teenage girl's face with long blonde hair and blue eyes may be displayed. If the "S" was instead a "D", the face would be entirely different. The ID-Data used for producing an object for display should not be ID-Data which is designed not to appear on-screen when entered (eg: a password), since the display of a corresponding object would give a rogue information on which to base guesses of the secret ID-Data.
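This face-generation idea can be sketched as a mapping from a hash of the entered ID-Data to a small set of discrete facial features. The feature dimensions and the FNV-1a hash are illustrative assumptions; a deployed scheme would use a cryptographic function, as the text says:

```c
#include <assert.h>
#include <stdint.h>

/* A few discrete feature dimensions for the generated face. */
enum { N_HAIR = 8, N_EYES = 4, N_AGE = 6 };

struct face { unsigned hair, eyes, age; };

static uint32_t hash_id(const char *id) {
    uint32_t h = 2166136261u;                 /* FNV-1a */
    for (const char *p = id; *p; p++)
        h = (h ^ (uint8_t)*p) * 16777619u;
    return h;
}

/* Every distinct ID-Data string deterministically selects one face;
 * changing a single character reshuffles the whole selection. */
static struct face face_for(const char *id) {
    uint32_t h = hash_id(id);
    struct face f;
    f.hair = h % N_HAIR;
    f.eyes = (h / N_HAIR) % N_EYES;
    f.age  = (h / (N_HAIR * N_EYES)) % N_AGE;
    return f;
}
```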
By utilising cryptography or having complex formulas to determine the sequencing of animation, the rogue programming the corresponding spoof program shall have to crack the cryptographic scheme in order to get the selection of character animation correct for any generalised attack. In the abovementioned example, a rogue will have to determine the algorithm for producing the face, since human beings are adept at recognising faces and will immediately notice if the face displayed on the screen is incorrect. Such a technique allows for a mathematically secure, visual method to guarantee the authenticity of the software which generates the screen feedback. The user of the software is instructed to note their own particular animation sequence and to immediately discontinue utilisation of the application 30 should that sequence ever change. The user may also be instructed to contact a trusted person, such as the supplier or operator of the application, to confirm that the animation sequence they witness is the authentic sequence intended by said supplier.
Further, the particular animation presented for a particular application 30 can be customised for each application so as to be distinct (such as by the incorporation of the application's name as part of the animated image).
Further hindrance for a rogue programmer can be created by hand-coding portions of the animation in assembly language so as to generate the maximum possible complexity and interaction in the animation with the highest level of detail for individual workstation computers. This further raises a hurdle allowing for the easier detection of rogue spoof programs 22, which will often be written in a more convenient, higher-level language (such as C or C++) and which will also operate at a different speed, the user being instructed to look for speed differences.
Further, animated scene timing can be utilised, providing anti-looping and frame-removal detection is still catered for. The animated scene timing allows a user to detect unexpected irregularities in a frequently presented animated interface. By including in the animation some deliberate regularity (such as the rhythmic convergence of some parts of the animation in one particular spot), a rogue programming a spoof program shall also have to duplicate the preferably complex timing events necessary to accomplish this convergence. The regular nature of the scene timing should be pronounced enough that the user expects to see certain events, thereby making it difficult for a rogue spoof program to copy the animation without including all parts of it.
Preferably, where possible, all ID-Data is immediately encrypted, which makes recovery of the ID-Data by a rogue through analysis of the computer program memory difficult. Preferably, public-key cryptographic methods (eg: elliptic-curve, RSA or Diffie-Hellman cryptography) should be used, making it impossible to reverse engineer the cryptographic code to decrypt any sensitive information should it be stolen in its encrypted form. Prohibiting all or most interrupts when data is to be entered, and encrypting or hashing the sensitive information immediately so that it is only stored partially, or in an encrypted form, before re-enabling interrupts, is one example of achieving this objective.
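One way the immediate-hashing part can be sketched: each keystroke is folded into a running hash and the plaintext byte is then erased, so no buffer ever holds the complete secret in the clear. The FNV-style hash and the omission of the interrupt disabling described above are illustrative simplifications:

```c
#include <assert.h>
#include <stdint.h>

/* Accumulator for ID-Data entry. Only the running hash is retained;
 * each plaintext keystroke is folded in and then overwritten.
 * (The interrupt disable/enable bracketing each fold, described in
 * the text, is omitted from this portable sketch.) */
struct id_accum { uint32_t hash; };

static void id_init(struct id_accum *a) { a->hash = 2166136261u; }

static void id_keystroke(struct id_accum *a, char key) {
    a->hash = (a->hash ^ (uint8_t)key) * 16777619u;  /* FNV-1a step */
    key = 0;                        /* erase the plaintext keystroke */
    (void)key;
}
```

Verification then compares the accumulated hash against stored check material, never the plaintext.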
As a further alternative, analysis of a user's personal characteristics can be included as part of the interface. This can include attempts at recognition of a user's typing style (duration of keypresses, delays between subsequent keys, choice of redundant keys, mouse usage characteristics, etc) or additional authentication techniques, including smartcards and biometric inputs such as fingerprint detectors.
Further, the graphical animation routines can be "watermarked" by the secure ID-Data entry program code in that "hidden" information may be incorporated into the scene (for example "salted checksums") to allow careful analysis of the output of secure ID-Data entry program code 31 to distinguish between original graphics animation and counterfeit animation. For example, the hidden information may be encoded in the least-significant bit of pixel data at selected locations of the animation.
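A minimal sketch of the least-significant-bit embedding and its verification. Here the watermark locations and bits are passed in directly; deriving them from a salted checksum of the frame, as the text suggests, is left out as an assumption:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Embed one watermark bit in the least-significant bit of each pixel
 * at the selected locations. */
static void embed(uint8_t *pixels, const size_t *locs,
                  const uint8_t *bits, size_t n) {
    for (size_t i = 0; i < n; i++)
        pixels[locs[i]] =
            (uint8_t)((pixels[locs[i]] & ~1u) | (bits[i] & 1u));
}

/* Read the bits back; a re-rendered or counterfeit frame is very
 * unlikely to reproduce them at the secret locations. */
static int verify(const uint8_t *pixels, const size_t *locs,
                  const uint8_t *bits, size_t n) {
    for (size_t i = 0; i < n; i++)
        if ((pixels[locs[i]] & 1u) != (bits[i] & 1u))
            return 0;
    return 1;
}
```

The one-bit perturbation per selected pixel is visually imperceptible in a busy animated frame, which is what lets the check remain hidden.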
The user-determinable sequence of animation can also extend to the audio feedback provided.
For example, audio and other feedback techniques, including music and speaking tones, can be played in response to particular keystroke combinations. By utilising different voices and/or tones and/or volumes and pitches for each keystroke or combination, the security of the application 30 can, once again, be substantially increased. The change in voice intonation will be readily "learnt" by a user and will thereby further inhibit a rogue's ability to duplicate the same sequence of sounds or voices. Of course, the encoding of the voice system should be in an encrypted form.
Further, upon detecting any attempt to subvert the secure ID-Data entry program code 31 (eg: subsequent to detecting tampering), a notification message is preferably sent to a prosecuting body or the like where the application 30 is currently, or later becomes, connected to a network such as the Internet, or by other means (eg: via modem or by including coded information in public or other files).
For application programs 30 requiring activation by a host program executed on a different computer, a secure means of activation can be incorporated into the client application 30. The host and client intercommunication can use challenge and response code authentication and verification utilising cryptographic systems such as public-key encryption and/or other standard means of overcoming data replay attacks and other threats designed to trick the secure client application 30 into activation.
It will be appreciated by a person skilled in the art that coding any data entry process utilising these techniques, together with additional techniques to protect against recording and eavesdropping, and executable protection techniques, may be necessary to improve the security of the interface. Additionally, executable encryption, additional authentication, and other methods are desirable in producing the protected executable.
It will be appreciated by a person skilled in the art that numerous combinations, variations and/or modifications may be made to the present invention as described without departing from the spirit or scope of the invention as broadly described. The present embodiments are, therefore, to be considered in all respects to be illustrative and not restrictive.
Summary of the Applicator (of an improved process of security as hereinbefore described)
The preferred embodiment of the present invention's method (hereinbefore described as the "applicator") by which to apply an improved process of security (as hereinbefore described) will now be described with reference to the accompanying drawings.
Referring now to Fig.7, there is shown a standard format utilised for storing executables on disk, often occurring in the art, and in particular in conjunction with programs run on the abovementioned operating systems. The standard executable 16 normally comprises a header section 71, a code section 72, and a data section 73. The header section 71 normally stores a standard set of information required by the computer operating system 17 (Fig.1) for running of the executable 16. This can include relocation data, code size etc. The code section 72 is normally provided for storing the "algorithmic" portion of the code. The data section 73 is normally utilised to store the data, such as constants, or overlays 92, utilised by the code section 72.
Turning now to Fig.6, the preferred embodiment of an applicator program 60 is shown which takes as its input the executable program 16 and performs an obfuscating step 61, a ciphering step 62, and an anti-key-press and authentication step 63 (described hereafter), which perform various transformations on the executable program 16 to produce a new executable program 30.
The obfuscating step 61 modifies the header 71 (Fig.7) of the executable 16 in addition to inserting loading code which will be described hereinafter. The cipher step 62 encrypts the existing executable 16 and calculates check-data (eg: a checksum) for the encrypted executable. The anti-key-press and authentication step 63 replaces various insecure system calls with safe equivalent code and preferably inserts code to graphically represent the integrity of said executable program.
The newly formed executable 30 (new.exe) can then be stored on disk and the applicator program 60 completed, the new executable 30 replacing the old executable program 16.
When it is desired to run the replacement executable program 30, the replacement executable 30 (new.exe) executes the obfuscating code previously inserted by applicator 60. The obfuscating code initially decrypts the executable program and validates the stored check-data before re-executing the decrypted executable.
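The applicator/loader round trip can be sketched end-to-end as follows, with a byte-wide XOR standing in for the real cipher and a simple rolling checksum standing in for the check-data (both illustrative assumptions):

```c
#include <assert.h>
#include <stddef.h>

/* Applicator side: encrypt the code image and record check-data over
 * the *encrypted* bytes. Loader side: validate the check-data, then
 * decrypt in place before transferring control. */
static unsigned check(const unsigned char *p, size_t n) {
    unsigned s = 0;
    while (n--) s = s * 131 + *p++;
    return s;
}

static void xor_cipher(unsigned char *p, size_t n, unsigned char key) {
    while (n--) *p++ ^= key;              /* encrypt == decrypt */
}

/* Returns 0 on tamper detection, 1 if the image decrypted cleanly. */
static int loader(unsigned char *img, size_t n, unsigned char key,
                  unsigned stored_check) {
    if (check(img, n) != stored_check)    /* validate before use   */
        return 0;
    xor_cipher(img, n, key);              /* restore original code */
    return 1;
}
```

In the real scheme the key material and check-data live inside the obfuscated header inserted by the applicator, not as function parameters.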
The foregoing description of the preferred embodiment has been in general terms and it will be understood by those skilled in the art that the invention has general application to many different operating systems, including MS-DOS, Apple Macintosh OS, OS/2, Unix etc.
The most common operating system utilised today is tiie MS-DOS operating svstem. This operating system is designed to run on INTEL x86 microprocessors and mcludes a large number of historical "quirks" which give rise to greater complexity than would perhaps be otherwise required when designing a new operating system from "scratch". For illustrative purposes, there will now be presented a specific embodiment ofthe preferred embodiment designed to operate under tiie MS-DOS operatmg system. Unfortunately, the example is quite complex as it operates in the framework ofthe MS-DOS operating system. Therefore, it is assumed that the reader is familiar wrth systems programming under the MS-DOS operatmg system. For an extensive explanation ofthe inner workings ofthe MS-DOS operating system, reference is made to standard texts in this field. For example, reference is made to "PC Intern" by Michael Tischer, published in 1994 by Abacus, 5370 52nd Street, S.E. Grand Rapids, MI 49512. A second useful text in this matter is "PC Architecture and Assembly Language" by Barry Cauler, published 1993 by Carda Prints, 22 Regatta Drive, Edgewater, WA 6027, Australia.
The specific embodiment of the present invention will be described with reference to altering an "EXE" executable program under DOS in accordance with the principles of the present invention.
Referring now to Fig. 9, there is shown the structure 90 of an executable ".EXE" program in MS-DOS as normally stored on disk. This structure is closely related to the structure 16 of Fig. 7 which illustrates the more general case. The structure 90 includes a header 71, otherwise known in MS-DOS terminology as the program segment prefix (PSP). This is normally followed by a relocation table 91 which contains a list of pointers to variables within a code area 72 which must be updated with an offset address when the program is loaded into a particular area of memory. The operation of the relocation table is well known to those skilled in the art of systems programming. The next portion of structure 90 is the code area 72 which contains the machine instructions for operation on the x86 microprocessor. This is followed by a program data area 73 which contains the data for code area 72. Finally, there may exist a number of overlays 92 which contain code which can be utilised in a known manner.
Referring now to Fig. 8, there is shown the structure of the EXE file header 71 in more detail. The table of Fig. 8 is reproduced from page 750 of the above-mentioned Tischer reference. It should be noted that the header 71 includes a number of fields including, for example, a pointer 81 to the start of the code 72 (Fig. 7) and a pointer 82 to the relocation table 91 (Fig. 9).
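The fixed fields of the MZ-style EXE header described above can be sketched as follows. This is an illustrative parser only; the field layout follows the well-known MS-DOS EXE header format, while the function and constant names are our own.

```python
import struct

# Fixed part of an MS-DOS "MZ" EXE header; all values after the
# signature are 16-bit little-endian words.
MZ_HEADER = struct.Struct(
    "<2s"  # signature "MZ"
    "H"    # bytes used in the last 512-byte page
    "H"    # total number of 512-byte pages
    "H"    # number of relocation entries
    "H"    # header size in 16-byte paragraphs
    "H"    # minimum extra paragraphs needed
    "H"    # maximum extra paragraphs requested
    "H"    # initial SS (relative)
    "H"    # initial SP
    "H"    # checksum
    "H"    # initial IP  (cf. pointer 81: start of code)
    "H"    # initial CS (relative)
    "H"    # offset of relocation table  (cf. pointer 82)
    "H"    # overlay number
)

FIELD_NAMES = ("signature", "last_page_bytes", "pages", "reloc_count",
               "header_paragraphs", "min_alloc", "max_alloc", "init_ss",
               "init_sp", "checksum", "init_ip", "init_cs",
               "reloc_table_offset", "overlay")

def parse_exe_header(raw: bytes) -> dict:
    """Unpack the fixed part of an EXE header into named fields."""
    h = dict(zip(FIELD_NAMES, MZ_HEADER.unpack_from(raw)))
    if h["signature"] != b"MZ":
        raise ValueError("not an MZ executable")
    return h
```

An applicator in the style of program 60 would read these fields to locate the relocation table and the initial CS:IP before rewriting them.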
In the specific embodiment, the applicator program 60 (Fig. 6) proceeds by means of the following steps:
(1) The executable program 16 is opened for reading and a determination made of its size.
(2) The header 71 (Fig. 9) of executable program 16 is then read in and a copy is stored within applicator program 60. A copy of the header 71 is written out to form part 101 of the new.exe file 30 as illustrated in Fig. 10.
(3) Next, from the fields 81, 82 of the header 71 (Fig. 8) a determination is made of the size of relocation table 91 of executable program 16.
(4) Next, a determination is made of the size of the executable code 72 and data portions 73.
(5) The relocation table 91 is then read into the memory of the applicator program 60. As noted previously, the relocation table 91 consists of a series of pointers to positions within code segment 72 which are required to be updated when loading the program.exe file into memory for execution. The relocation table is sorted 93 by address before being written out to the new.exe executable file at position 102.
(6) As noted previously, the relocation table 91 consists of a series of pointers into code area 72. A determination is made of the size of a code, known as the "netsafe 1" code 104; the contents of this code will be described hereinafter. Next, a search is conducted of the sorted relocation table 102 to find an area between two consecutive pointers within code section 72 which is of greater magnitude than the size of netsafe 1 code 104. This area 94, designated part B in Fig. 9, is located. If this code portion 94 cannot be located, the applicator program 60 exits with an error condition.
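The gap search of step (6) can be sketched as below. This is a simplified model under our own assumptions: each relocation entry is taken to patch a 4-byte far pointer, and the section start and end are treated as gap boundaries; the function name is illustrative, not from the patent.

```python
def find_insertion_gap(reloc_offsets, code_size, stub_size):
    """Scan a sorted relocation table for the first stretch of code,
    untouched by any relocation patch, big enough to hold the stub
    (the "netsafe 1" code in the patent's terms).

    Returns (start, end) of the gap, or None if no gap is large
    enough (the applicator would then exit with an error)."""
    points = sorted(set(reloc_offsets)) + [code_size]
    prev_end = 0
    for p in points:
        if p - prev_end >= stub_size:
            return (prev_end, p)   # candidate "part B" region
        prev_end = p + 4           # skip past the patched far pointer
    return None
```

With relocations at offsets 10, 18 and 200 in a 400-byte code section, a 50-byte stub fits between the entry at 18 and the one at 200.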
Upon finding code portion 94, the code portion 95, also denoted part A, is encrypted and copied across to form new code portion 103. Code portion 94 is then encrypted and copied to an area 105 of new.exe 30. The netsafe 1 code 104 is then inserted by applicator 60. Code portion 96, also denoted part C, is encrypted and copied across to form code portion 106. Data portion 73 and overlay portion 92 are copied into new.exe 30 as shown. A second portion of obfuscating code, denoted "netsafe 2" 107, the contents of which will be described hereinafter, is then inserted after overlays 92 and before code portion part B 105.
(7) The header 101 is then updated to reflect the altered layout of new.exe executable 30. Additionally, the initial address 109 of execution stored in header 101 is altered to be the start of netsafe 1 portion 104.
(8) As mentioned before, code portions 103, 106 and 105 are subjected to encryption or encipherment in accordance with step 62 of Fig. 6. The encryption scheme utilised can be subjected to substantial variation. In this embodiment, the DES standard encryption scheme was utilised. This scheme relies on a fifty-six-bit key for encryption and decryption and is well known in the art.
Once encrypted, it is necessary to store the decryption key in new.exe executable 30. A number of different methods can be utilised to store the key. The preferred method is to spread portions of the key to different positions within the executable 30. For example, bits of the key can be stored within the netsafe 1 code 104 and netsafe 2 code 107. Additionally, bits of the key can be stored within header portion 101. Also, it is envisaged that bits of the key can be stored in the condition codes which are a consequence of execution of various instructions within netsafe 1 area 104 and netsafe 2 area 107 and/or the operating system 17 (Fig. 5), with the overall requirement being that the key can be later extracted using a predetermined algorithm.
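One way the key-spreading idea might look in code is sketched below. The scheme here — hiding the 56-bit key two bits at a time in the low bits of selected bytes of the netsafe 1, netsafe 2 and header regions — is our own illustrative choice; the patent only requires that the scatter be reversible by a predetermined algorithm.

```python
def scatter_key(key56: int, regions: dict, slots: list) -> None:
    """Hide a key two bits at a time in the low bits of chosen bytes.
    `regions` maps region names (e.g. "netsafe1") to bytearrays;
    `slots` is the predetermined list of (region, byte offset) pairs,
    two key bits per slot (28 slots for a 56-bit key)."""
    for i, (region, offset) in enumerate(slots):
        bits = (key56 >> (2 * i)) & 0b11
        buf = regions[region]
        buf[offset] = (buf[offset] & ~0b11) | bits

def gather_key(regions: dict, slots: list) -> int:
    """Reverse process: reassemble the key from the scattered bits."""
    key = 0
    for i, (region, offset) in enumerate(slots):
        key |= (regions[region][offset] & 0b11) << (2 * i)
    return key
```

The slot list plays the role of the "predetermined algorithm": anyone holding it can reconstruct the key, while the key never appears contiguously in the file.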
(9) The next step is to patch the address of the start of code area 72 and netsafe 2 code area 107 into the required locations within netsafe 1 area 104.
The netsafe 1 area is then written to the file containing new.exe executable 30.
(10) The area 106 is then encrypted as aforementioned and written to the executable 30, followed by overlays 92 and encrypted netsafe 2 code portion 107.
(11) As will become apparent hereinafter, upon execution of new.exe executable 30, netsafe 2 area 107 is responsible for loading code portion 105 over the top of netsafe 1 area 104. Therefore, it is necessary to write the relevant addresses of the start and end of code portion 94 to the required position within netsafe 2 area 107.
(12) As will be described hereinafter, netsafe 2 area 107 is also responsible for decrypting the encrypted portions of codes 103, 104, 105, 106, and 107 and hence the netsafe 2 area 107 must also store this combined code size for later use on decryption.
Finally, an overall checksum for new.exe 30 is calculated and stored at the end of the file at position 108. This checksum is later used to verify the success of the decryption procedures and to prevent the execution of "scrambled" code, which would be the result if new.exe 30 were tampered with.
As will be further described hereinafter, netsafe code areas 104 and 107 contain code to decrypt the encrypted areas of the new.exe 30, to repatch code portion 105 back to its original position, and to replace potentially insecure routines or easily spoofed screens normally utilised by the application (eg: unsafe keyboard drivers) with an alternative safe form of routine.
Upon execution of the new.exe executable 30, the executable starts at the start of netsafe 1, area 104 (Fig. 11), as this address has been previously patched into position 109 (Fig. 10) of header 101 (Fig. 10). The netsafe 1 area 104 then performs the following steps (A1) to (A10):
(A1) The first step is to disable all the interrupts apart from those necessary for continued operation of the computer device 18 (Fig. 1) (for example, memory refresh cannot be disabled). The disabling of interrupts includes the disabling of the keyboard interrupt in order to stop amateur "code snoopers" from determining the operation of the code area 104.
(A2) The next step is to interrogate the calling environment of the operating system stack to ensure that the program new.exe was not called by a debugging program which is tracing the operation of new.exe. Additionally, the data variables necessary for operation of netsafe 1 code area 104 are defined to be on the operating system stack (refer addresses 0EH and 10H in Fig. 8). This stack will change unexpectedly when in a code-snooping or debugging environment and will cause the debugger to crash, thereby stopping it from following the operation of new.exe executable 30.
(A4) The interrupt trap addresses are then altered in a two-stage process. The first stage resets a first part of the SEG:OFF address format and occurs at this point, with a second stage occurring at a later time as will be further described herein below. By staging the alteration of interrupt trap addresses, any code snooper will be further confused as said trap addresses will initially be garbage.
(A5) Any input from the keyboard is further disabled by informing the MS-DOS operating system to ignore any received keys.
(A6) The second stage of the revectoring of the normal debugging interrupts is then applied so that the normal debugging interrupts can be used by the decryption code, to be described hereinafter, thereby making debugging almost impossible.
(A7) A check is then made to ensure that the above processes have been successful in that the debugger interrupts do not point to any debuggers, the keyboard is still disabled and the operating system has disabled the acceptance of keys from the keyboard.
(A8) The key for decryption is then reconstructed utilising the reverse process to that utilised in storing the information located in the key.
(A9) Turning now to Fig. 11, there is shown the standard format of the executable new.exe 30 when executing in memory. As will be well known to those skilled in the art, an executing program under the MS-DOS system will include a stack 111 and work space 112. A memory allocation (malloc) call is then done to set aside an area 113 for the loading in of the netsafe 2 code 107 of Fig. 10. The disk copy of new.exe 30 (having the format shown in Fig. 10) is then opened by the netsafe 1 code 115 and an encrypted copy of netsafe 2 code 107 (Fig. 10) is then loaded in from the disk file, decrypted and stored in memory area 113. The relocatable pointers of the code contained within the netsafe 2 code 113 are then updated to reflect the position of the executable in memory.
(A10) Control is then passed to netsafe 2 code 113.
The code area netsafe 2, 113 then performs the following steps (B1) to (B4):
(B1) The portion of code of the disk copy denoted part B, 105 (Fig. 10) is read in from disk in an encrypted format and written over the old netsafe 1 code 115.
(B2) As will be further described hereinafter, the netsafe 2 area 113 includes a number of keyboard routines which are preferably stored in an encrypted format. Therefore, the next step is to apply the decryption to any of the encrypted areas of netsafe 2 code area 113. After decryption, the netsafe 2 area 113 is checksummed and the result is tested against a prestored checksum to ensure the integrity of netsafe 2 area 113.
(B3) The disk copy of the new.exe is then again read in and checked against prestored check data to ensure that it has not been changed. Additionally, an attempt is made to read past the end of file of the disk copy of new.exe 30 (Fig. 10) to ensure that no extension (eg: viral) has occurred.
(B4) The encrypted portions of the memory copy (Fig. 11) of new.exe are then decrypted utilising the key and, once decrypted, the decrypted portions are again checked and tested against predetermined data.
The next step in execution of the netsafe 2 code 113 is to replace insecure (eg: keyboard) system routines with a more secure method. Referring now to Fig. 12, there is shown the current state of the new.exe executable in memory. The insertion of the more secure system routines then proceeds in accordance with the following steps (C1) to (C5):
(C1) Firstly, a second memory allocation is done to set aside an area 51 (Fig. 13) for the storing of the secure hardware routines (eg: keyboard). These routines are then copied from their area within netsafe 2 code 113 to the memory area 51.
(C2) Next, the ID-Data entry routines which are normally activated by the interrupt table 131 when dealing with ID-Data input are altered such that, rather than pointing to corresponding areas of the MS-DOS operating system 17, they point to the corresponding secure area 51. These interrupts include interrupt 9 which occurs when a key is pressed on a keyboard, interrupt 29h which reads a key and interrupt 16h which tests for the presence of a key.
(C3) The executable 30 (Fig. 13) is then ready for execution and the registers are initialised, the memory area 113 deallocated and control passes to the original start address of executable program 16.
(C4) It will be evident that, when executing, all keyboard calls (or other ID-Data entry calls, if other than keyboard) will be passed to keyboard (or other) routines 51, with the keyboard hardware being interrogated directly by keyboard routines 51 to return information to the calling program. Keyboard routines 51 include a copy of the correct interrupt vector addresses for each keyboard routine and, each time they are called, a check is made of the interrupt table to ensure that it has not been altered. Preferably, keyboard routines 51 protect the keyboard hardware by issuing controller reset or similar commands to flush the keyboard data out of the circuitry after said data is retrieved to prevent hardware eavesdropping, or routines 51 utilise the protected mechanisms of the central processor to protect said hardware from eavesdropping.
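The per-call interrupt-table integrity check of steps (C2) and (C4) can be modelled in miniature as below. The interrupt table is a plain dictionary and the hardware port a list, purely for illustration; the class name and the `"secure_area_51"` marker are our own stand-ins for the memory area 51 of the patent.

```python
class SecureKeyboard:
    """Toy model of the secure keyboard routines in area 51: install
    handlers for the keyboard interrupts, remember the correct
    vectors, and refuse service if the table is later re-vectored."""

    GUARDED = (0x09, 0x16, 0x29)   # keyboard interrupts named in (C2)

    def __init__(self, interrupt_table: dict):
        self.table = interrupt_table
        # Point the guarded interrupts at our secure routines and keep
        # a private copy of the correct vector for each one.
        for num in self.GUARDED:
            self.table[num] = ("secure_area_51", num)
        self.saved = {n: self.table[n] for n in self.GUARDED}

    def read_key(self, hardware_port: list) -> int:
        # Per-call check: has a rogue program altered the table?
        if any(self.table[n] != self.saved[n] for n in self.GUARDED):
            raise RuntimeError("interrupt table tampered")
        return hardware_port.pop(0)   # interrogate the hardware directly
```

A rogue re-vector of interrupt 9 is caught on the very next call, mirroring the check the patent describes.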
(C5) When the executable 30 terminates, interrupt 21h (an MS-DOS standard) is called. This interrupt is also revectored to a corresponding area of routines 51. The termination code of keyboard routine area 51 restores the correct interrupt pointers in interrupt table 131 to point to the MS-DOS operating system 17, and clears the no-longer-needed program and data from memory before returning to the DOS operating system by calling the real interrupt 21.
The foregoing describes only one particular embodiment of the present invention, particularly as to the operation of the MS-DOS operating system. It will be evident to those skilled in the art that the principles outlined in the particular embodiment can be equally applied to other operating systems in accordance with the objects of the present invention. Further modifications, obvious to those skilled in the art, can be made thereto without departing from the scope of the invention.
EXPLANATION AND PURPOSE OF CLAIMS
Claims 1, 2, and 3 are independent. The invention in claim 1 covers any high-security software protecting ID-Data by utilising anti-spy techniques, and tamper-protecting itself. Claim 2 is for a method of producing high-security software, such as, but not limited to, that in claim 1. Claim 3 is for a new process of graphically representing the authenticity of high-security software, such as, but not limited to, that in claim 1 or produced by claim 2.
Claims 4, 5, 6, 7, 8, and 9 add preferred components to the high-security enforcing functions of the software in claim 1. Claim 10 adds a tracing-prevention preferred component to claim 9.
Claims 11, 12, 13, 14, 15, 16, 50, and 53 add preferred components to the security-applicator method of claim 2.
Claims 17 to 49 inclusive and claims 51 & 52 outline the specific area of protection that this invention affords a computer program acting as a user interface (eg: ID-Data entry screen). Specifically, they specify how this invention applies in the areas of protecting an interface against counterfeiting (i.e.: hampering the possibility that a fake copy of said interface can be successfully presented to a user to fool said user into entering information into the fake interface), and protecting an interface against malicious (or otherwise) tampering, examination, emulation, and eavesdropping.
Claims
1. A high security executable program comprising:
(a) purpose-written computer input routines within or accessed by software on a computer system for the entry of ID-Data (as hereinbefore defined), and (b) anti-spy techniques (as hereinbefore defined) within said input routines which prevent or hamper eavesdropping (as hereinbefore defined) on said ID-Data, and (c) tamper-detection techniques (as hereinbefore defined) within or accessed by said software to detect tampering (as hereinbefore defined) and techniques which, upon detection of tampering, either disallow the subsequent entry of ID-Data into said input routines, or which invalidate said ID-Data in order to disallow current and subsequent access to that which said ID-Data would have otherwise allowed.
2. A method of altering an original executable program to form an altered executable program having increased security, said method comprising the steps of:
(a) inserting obfuscating code into a first number of predetermined areas of said executable program; and
(b) encrypting portions of said executable program for later decryption upon execution; such that, upon execution of said altered executable program, said execution includes the steps of:
(c) decrypting the altered executable program; and
(d) restoring said altered executable program to said original executable program. 3. A method of providing for a secure entry of input information in a computer system comprising:
(a) activating a visual display or animation and/or audio feedback (hereinafter called an audiovisual component) as part of said secure entry of input information so as to hamper emulation of said secure entry process; and (b) audio/visual component feedback of two or more of:
(c) all or part of said input information;
(d) all or part of information based upon some transformation of said input information;
(e) all or part of some transformation of all or part of the software comprising said audio/visual component and/or the computer operating system upon which said audio/visual component operates.
4. A method as claimed in claim 1 additionally including the replacement of code which is vulnerable to eavesdropping (as hereinbefore defined) with equivalent code which removes said vulnerability; said equivalent code which communicates directly with the hardware of the computer while disabling system interrupts or other functions which would permit rogue software (as hereinbefore defined) to eavesdrop.
5. A method as claimed in claim 1 additionally including one or more automatic disassembly (as hereinbefore defined) techniques of (a) obfuscating inserts (as hereinbefore defined), (b) dummy instructions (as hereinbefore defined), or (c) executable encryption (as hereinbefore defined).
6. A method as claimed in claim 1 additionally including code to detect tampering (as hereinbefore defined) by re-reading its own external image or its internal memory image and comparing said image or a calculated check of said image with pre-calculated check-data or known identical equivalents.
7. A method as claimed in claim 1 additionally including code to automatically memory-scan the said software one or more times before or during execution of said software to detect tampering (as hereinbefore defined).
8. A method as claimed in claim 1 additionally including code to store or communicate details of detected tampering for later examination, said details including all or part of said tampered software, and/or other information available to said tampered software from said computer system.
9. A method as claimed in claim 1 additionally including code to prevent, or detect and subsequently prevent tracing, or mislead code debuggers and execution tracing by utilising debugger trap facilities for the normal operation of said security-enhanced software, and/or monitoring system timers or including timing-sensitive instructions or monitoring CPU stack contents or monitoring system buffers to detect the activity of code debuggers, and/or disabling facilities including the keyboard, serial ports, printer ports, mouse, screen or system interrupts in order to hamper code debuggers, and/or testing that the disabled status is still true of said facilities to detect code debuggers, and/or utilising system interrupts which would ordinarily be used by code debuggers for the custom purposes of said security-enhanced software, and/or utilising CPU instruction caches together with self-modifying code to mislead code debuggers, and/or scanning or interrogating the operating system or executable-load-process to detect code debugger instructions or environments.
10. A method as claimed in claim 9 additionally including a process or multiple processes which are resident or child processes of said security-enhanced software which execute during system interrupts or after the parent process has terminated in order to hamper tracing.
11. A method as claimed in claim 2 wherein said obfuscating code includes replacement codes for insecure system routines and said execution further includes the step of: (e) replacing the execution of said insecure system routines with said replacement codes.
12. A method as claimed in claim 2 wherein said steps (c) and (d) occur while simultaneously substantially disabling eavesdropping on the operation of said steps (c) and (d) by any rogue program.
13. A method as claimed in claim 2 wherein said step (a) includes inserting a portion of said obfuscating code into the code area of said original executable program.
14. A method as claimed in claim 11 wherein said step (e) includes altering portions of an interrupt vector table to point to said replacement codes.
15. A method as claimed in claim 2 wherein said step (b) includes the storing of a decryption key in a plurality of predetermined areas of said altered executable program.
16. A method as claimed in claim 15 wherein said predetermined areas include the condition codes of predetermined instructions of said altered executable program.
17. A method as claimed in claim 3 wherein said audiovisual component has repeatable characteristics during subsequent invocations of said entry process, such that said audiovisual component on each invocation of said entry process has a predetermined resemblance to the audiovisual component of all other invocations of said entry process.
18. A method as claimed in claim 3 wherein said audiovisual component is varied in accordance with the information entered.
19. A method as claimed in claim 3 wherein said audiovisual component comprises moving parts and/or includes 2.5-dimensional animation or 3-dimensional animation.
20. A method as claimed in claim 3 wherein said audiovisual component includes a representation of said input information.
21. A method as claimed in claim 20 wherein said input information representation comprises (a) display of a single graphical object and/or (b) production of a single audio-feedback sequence, after the entry of all or part of said input information.
22. A method as claimed in claim 20 wherein said input information representation includes animation of input characters and/or audible or other feedback determined by input characters.
23. A method as claimed in claim 22 wherein the representation of said input characters varies for each character based on the result of a predetermined transformation of the preceding inputted characters.
24. A method as claimed in claim 23 wherein said transformation utilises cryptographic or hashing methods.
25. A method as claimed in claim 3 wherein the ease by which faithful replication of said audiovisual component is achieved is substantially reduced by inclusion in said audiovisual component of the techniques of on-screen shadow rendering and/or spot or flood scene lighting effects and/or scene or object shading and/or transparent or translucent objects and/or shiny, reflective, or mirrored objects and/or real-time animation roughly obeying real-world gravitational effects and/or single-image-random-dot-stereogram bitmaps or backdrops and/or partial scene masking effects and/or full or partial scene distortion or diffraction effects and/or animated objects designed to resist simple hidden-surface removal techniques and/or animated bitmaps and/or audible echo effects and/or differing audio voice effects and/or differing audio volume and/or differing audio tones or pitches.
26. A method as claimed in claim 3 wherein said audiovisual component is immediately recognisable to human beings and includes information which identifies to the user the application to which said audiovisual component belongs.
27. A method as claimed in claim 3 wherein the ease by which faithful replication of said audiovisual component is achieved is further reduced by inclusion in said audiovisual component's animation of object movement timing such that at near-regular and frequent intervals regularities occur which are obviously recognisable to users of said entry process.
28. A method as claimed in claim 3 wherein said entry process including said audiovisual component utilises a substantial portion of the computational resources of said computer system.
29. A method as claimed in claim 3 wherein said entry process code responsible for said audiovisual component is coded in the assembly language of the computer system.
30. A method as claimed in claim 3 wherein recording of said audiovisual component by said computer system is disabled.
31. A method as claimed in claim 3 wherein (a) the facility to suspend or swap out said entry process is either disabled, or (b) immediately upon suspension request, said entry process is protected against subsequent examination by encryption or by termination and removal from memory of said entry process, or (c) where the facility to allow the central processor or processors of said computer system to execute code other than the code of, or the code necessary for, said entry process is either disabled or else said entry process is protected against examination.
32. A method as claimed in claim 3 wherein said entry process hampers simple recording by utilising the maximum practicable use of audiovisual frame rate, and/or audiovisual resolution, and/or screen colours, and/or audiovisual design in said audiovisual component on said computer system.
33. A method as claimed in claim 3 wherein said entry process hampers the compression of recorded output from said audiovisual component by utilising high audiovisual entropy and/or by the inclusion of random or other noise in said audiovisual component.
34. A method as claimed in claim 3 wherein said audiovisual component includes continuous output such that the looping of only a subset of said output shall not reproduce a copy largely indistinguishable from said audiovisual component.
35. A method as claimed in claim 1 or claim 3 wherein said ID-Data or said input information is encrypted with some cryptographic process or hashed immediately upon entry and a plain-text equivalent is not stored by said computer system.
36. A method as claimed in claim 35 wherein disablement of one or more interrupt instructions (or equivalent CPU devices) is utilised to protect said cryptographic or said hash process of said ID-Data to hamper the recovery of said ID-Data by processes other than said entry process.
37. A method as claimed in claim 1 or claim 3 wherein said input routines or said secure entry process prevents the re-vectoring of system interrupts in order to protect said ID-Data or said input information from being stolen, by means of re-applying interrupt vector pointers one or more times and/or by means of examining interrupt assignments in order to perform a predetermined function should the expected assignments be altered.
38. A method as claimed in claim 1 or claim 3 wherein, in order to further authenticate and/or identify said user, additional aspects of said ID-Data or said input information are used including the duration of individual key presses and/or mouse button presses and/or the delay between subsequent individual key presses or mouse button presses and/or the user's selection of particular keys when more than one equivalent exists and/or the acceleration or velocity characteristics of mouse usage and/or where said input information includes information from other sources including biometric and/or smartcard information.
39. A method as claimed in claim 1 or claim 3 wherein said input routines or said secure entry process authenticates itself using (a) executable code checksums of RAM or other images of its own executable code and/or data, (b) and/or comparison of memory with other stored copies of said executable code, (c) and/or decryption of said entry process, (d) and/or detection of executable tampering by examination of the executable's environment, (e) and/or comparison of executable size with expected values, (f) and/or by attempting to read past the end of the executable file to determine that the size is correct; parts (a) through (f) occurring either upon initial load or during or after execution one or more times or continually during execution.
40. A method as claimed in claim 1 or claim 3 wherein said input routines or said secure entry process makes use of system interrupts to monitor itself in order to detect alteration of itself.
41. A method as claimed in claim 39 or claim 40 wherein said input routines or said secure entry process incorporates means by which to notify and/or transmit authentication failure details to a third person or process should said self-authentication fail.
42. A method as claimed in claim 1 or claim 3 wherein said input routines or said secure entry process records a log of the usage and/or details of the user of said input routines or said secure entry process.
43. A method as claimed in claim 1 or claim 3 wherein said input routines or said secure entry process incorporates warnings within the executable image indicating that examination and/or tampering is prohibited.
44. A method as claimed in claim 3 wherein said audiovisual component contains watermark information incorporated into the scene to allow close inspection of said audiovisual component to distinguish between the genuine process and a close replica.
45. A method as claimed in claim 1 or claim 3 wherein said input routines or said secure entry process's loading and/or decryption routines are stored within tiie executable image in such a way as they inrtially replace other entry process routines and upon successful decryption and/or authentication, said other entry process routines are replaced.
46. A method as claimed in claim 1 or claim 3 wherem said input routines or said secure entry process hampers executable-code tracing through control-flow changes in debug environments or through disabling one or more system interrupts and/or disabling the keyboard and/or disabling the mouse or other input devices and/or making use of tiie program stack pointer to discern existence of a debug environment and/or utilising debug interrupts for program code operation and or self- modification of executable code and or examination of CPU flag registers and/or verification of disabled interrupts still-disabled state and/or verification of disabled keyboards stiU-disabled state and/or loading additional executable code into memory during execution. 47. A method as claimed in claim 1 or claim 3 wherem tiie executable image of said input routines or said secure entry process includes obfuscating assembly language dummy operation codes or instruction prefixes inserted after one or more unconditional branches to hamper executable disassembly and or decompilation and/or reverse engineering.
48. A method as claimed in claim 1 or claim 3 wherein said input routines or said secure entry process is securely activated by its activation process and/or a host or server computer using a challenge/response activation protocol or using public or private key cryptographic methods.
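A minimal sketch of the challenge/response option in claim 48, assuming a shared symmetric key and HMAC-SHA256 (the patent leaves the exact protocol and key material open):

```python
import hashlib
import hmac
import secrets

# Hypothetical shared key provisioned to both sides at install time.
ACTIVATION_KEY = b"shared-secret-provisioned-at-install"

def issue_challenge():
    """Host/server side: generate a fresh random challenge (anti-replay)."""
    return secrets.token_bytes(16)

def respond(challenge):
    """Entry-process side: prove possession of the key without sending it."""
    return hmac.new(ACTIVATION_KEY, challenge, hashlib.sha256).digest()

def verify(challenge, response):
    """Host/server side: recompute and compare in constant time."""
    expected = hmac.new(ACTIVATION_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because each challenge is random, a recorded response cannot be replayed against a later challenge, which is the point of activating the entry process this way rather than with a static password.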
49. A method as claimed in claim 1 or claim 3 wherein said input routines or said secure entry process is stored outside of said computer system memory in encrypted form and/or where said entry process employs techniques to hinder executable-code tracing and/or executable-code disassembly or disclosure or decompilation and/or executable-code tampering and/or executable-code hot-patching and/or reverse-engineering and/or pre-, in-, or post-execution executable-code recording, copying, eavesdropping or retrieval and/or theft of said input information from keyboard hardware or software or drivers.
50. A method as claimed in claim 2, 11, 12, 13, 14, 15, or 16 further comprising the insertion of one or more components as claimed in claims 1, 4, 5, 6, 7, 8, 9, 10, or 51.
51. A process as claimed in claim 3, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, or 49 further comprising protecting all or part of said input routines or said secure entry process with zero or more components as claimed in claims 1, 4, 5, 6, 7, 8, 9, 10, or 50.
52. A method for providing for the secure input of information into a computer system, or a high security executable, substantially as hereinbefore described with reference to the accompanying drawings.
53. A method of altering an original executable program to form an altered executable program having increased security, substantially as hereinbefore described with reference to the accompanying drawings.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AUPN4186 | 1995-07-14 | ||
AUPN4186A AUPN418695A0 (en) | 1995-07-14 | 1995-07-14 | Computer security system |
AUPN9866A AUPN986696A0 (en) | 1996-05-15 | 1996-05-15 | Interface authentication system |
AUPN9866 | 1996-05-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1997004394A1 true WO1997004394A1 (en) | 1997-02-06 |
Family
ID=25644994
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU1996/000440 WO1997004394A1 (en) | 1995-07-14 | 1996-07-12 | Computer software authentication, protection, and security system |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO1997004394A1 (en) |
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999001815A1 (en) * | 1997-06-09 | 1999-01-14 | Intertrust, Incorporated | Obfuscation techniques for enhancing software security |
EP0949574A2 (en) * | 1998-03-25 | 1999-10-13 | Arachnid, Incorporated | Customizable multimedia segment structures |
WO1999056196A1 (en) * | 1998-04-30 | 1999-11-04 | Bindview Development Corporation | Computer security |
EP1000482A1 (en) * | 1997-08-06 | 2000-05-17 | Intel Corporation | Cell array providing non-persistent secret storage through a mutation cycle |
EP1010291A1 (en) * | 1997-09-05 | 2000-06-21 | Intel Corporation | A tamper resistant player for scrambled contents |
EP1018236A1 (en) * | 1997-09-05 | 2000-07-12 | Intel Corporation | Tamper resistant methods and apparatus |
EP1020049A1 (en) * | 1997-09-05 | 2000-07-19 | Intel Corporation | Tamper resistant methods and apparatus |
WO2000065442A1 (en) * | 1999-04-23 | 2000-11-02 | Giesecke & Devrient Gmbh | Protection of the core part of a computer against external manipulation |
WO2000065444A1 (en) * | 1999-04-28 | 2000-11-02 | Thomas Probert | Techniques for encoding information in computer code |
WO2000072112A2 (en) * | 1999-05-12 | 2000-11-30 | Fraunhofer Crcg, Inc. | Obfuscation of executable code |
WO2000077597A1 (en) * | 1999-06-09 | 2000-12-21 | Cloakware Corporation | Tamper resistant software encoding |
US6237137B1 (en) | 1997-10-15 | 2001-05-22 | Dell Usa, L.P. | Method and system for preventing unauthorized access to a computer program |
US6256737B1 (en) | 1999-03-09 | 2001-07-03 | Bionetrix Systems Corporation | System, method and computer program product for allowing access to enterprise resources using biometric devices |
US6334189B1 (en) | 1997-12-05 | 2001-12-25 | Jamama, Llc | Use of pseudocode to protect software from unauthorized use |
WO2001099034A2 (en) * | 2000-06-21 | 2001-12-27 | Aladdin Knowledge Systems; Ltd. | System for obfuscating computer code to prevent disassembly |
US6480959B1 (en) | 1997-12-05 | 2002-11-12 | Jamama, Llc | Software system and associated methods for controlling the use of computer programs |
US6643775B1 (en) | 1997-12-05 | 2003-11-04 | Jamama, Llc | Use of code obfuscation to inhibit generation of non-use-restricted versions of copy protected software applications |
GB2391341A (en) * | 2002-07-31 | 2004-02-04 | Hewlett Packard Co | A method of validating the rights of a user to participate in an interactive computer environment |
US6728219B1 (en) | 1999-11-15 | 2004-04-27 | Networks Associates Technology, Inc. | Graphical user interface system and method for visually gauging network performance |
US6779114B1 (en) | 1999-08-19 | 2004-08-17 | Cloakware Corporation | Tamper resistant software-control flow encoding |
US7140005B2 (en) * | 1998-12-21 | 2006-11-21 | Intel Corporation | Method and apparatus to test an instruction sequence |
US7240363B1 (en) * | 1999-10-06 | 2007-07-03 | Ellingson Robert E | System and method for thwarting identity theft and other identity misrepresentations |
US7260845B2 (en) * | 2001-01-09 | 2007-08-21 | Gabriel Kedma | Sensor for detecting and eliminating inter-process memory breaches in multitasking operating systems |
US7383569B1 (en) | 1998-03-02 | 2008-06-03 | Computer Associates Think, Inc. | Method and agent for the protection against the unauthorized use of computer resources |
WO2007147495A3 (en) * | 2006-06-21 | 2008-08-28 | Wibu Systems Ag | Method and system for intrusion detection |
US7421586B2 (en) | 1999-05-12 | 2008-09-02 | Fraunhofer Gesselschaft | Protecting mobile code against malicious hosts |
US7770016B2 (en) | 1999-07-29 | 2010-08-03 | Intertrust Technologies Corporation | Systems and methods for watermarking software and other media |
US7877613B2 (en) | 2002-09-04 | 2011-01-25 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Protecting mobile code against malicious hosts |
US7950048B2 (en) * | 2000-03-29 | 2011-05-24 | Microsoft Corporation | Methods and arrangements for limiting access to computer controlled functions and devices |
US8016189B2 (en) | 1996-12-04 | 2011-09-13 | Otomaku Properties Ltd., L.L.C. | Electronic transaction systems and methods therefor |
US8136148B1 (en) | 2008-04-09 | 2012-03-13 | Bank Of America Corporation | Reusable authentication experience tool |
FR2986124A1 (en) * | 2012-01-25 | 2013-07-26 | Ercom Engineering Reseaux Comm | METHOD FOR AUTHENTICATING A DEVICE COMPRISING A PROCESSOR AND A CHIP CARD BY GENERATING A PATTERN |
US9009798B2 (en) | 2000-03-23 | 2015-04-14 | Citibank, N.A. | System, method and computer program product for providing unified authentication services for online applications |
US9398013B2 (en) | 1999-03-09 | 2016-07-19 | Citibank, N.A. | System, method and computer program product for an authentication management infrastructure |
EP3185194A1 (en) * | 2015-12-24 | 2017-06-28 | Gemalto Sa | Method and system for enhancing the security of a transaction |
US9843447B1 (en) | 1999-09-09 | 2017-12-12 | Secure Axcess Llc | Authenticating electronic content |
US10237073B2 (en) | 2015-01-19 | 2019-03-19 | InAuth, Inc. | Systems and methods for trusted path secure communication |
CN110162937A (en) * | 2018-02-09 | 2019-08-23 | 黄冈职业技术学院 | The method for realizing protecting computer software based on network communication |
CN113343234A (en) * | 2021-06-10 | 2021-09-03 | 支付宝(杭州)信息技术有限公司 | Method and device for carrying out credible check on code security |
US11880832B2 (en) | 2015-12-24 | 2024-01-23 | Thales Dis France Sas | Method and system for enhancing the security of a transaction |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2600184A1 (en) * | 1986-06-17 | 1987-12-18 | Vo Quang Tuyen | Method of protecting a computer program and electronic key for implementing this method |
WO1988003287A1 (en) * | 1986-10-24 | 1988-05-05 | Harcom Security Systems Corporation | Computer security system |
EP0326700A2 (en) * | 1988-02-01 | 1989-08-09 | International Business Machines Corporation | A trusted path mechanism for virtual terminal environments |
US4864494A (en) * | 1986-03-21 | 1989-09-05 | Computerized Data Ssytems For Mfg., Inc. | Software usage authorization system with key for decrypting/re-encrypting/re-transmitting moving target security codes from protected software |
WO1992014209A1 (en) * | 1991-02-05 | 1992-08-20 | Toven Technologies Inc. | Encryption apparatus for computer device |
EP0568438A1 (en) * | 1992-04-27 | 1993-11-03 | Gemplus Card International | Method for securing of executable programs against utilisation by an unauthorized person and security system for its application |
-
1996
- 1996-07-12 WO PCT/AU1996/000440 patent/WO1997004394A1/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4864494A (en) * | 1986-03-21 | 1989-09-05 | Computerized Data Ssytems For Mfg., Inc. | Software usage authorization system with key for decrypting/re-encrypting/re-transmitting moving target security codes from protected software |
FR2600184A1 (en) * | 1986-06-17 | 1987-12-18 | Vo Quang Tuyen | Method of protecting a computer program and electronic key for implementing this method |
WO1988003287A1 (en) * | 1986-10-24 | 1988-05-05 | Harcom Security Systems Corporation | Computer security system |
EP0326700A2 (en) * | 1988-02-01 | 1989-08-09 | International Business Machines Corporation | A trusted path mechanism for virtual terminal environments |
WO1992014209A1 (en) * | 1991-02-05 | 1992-08-20 | Toven Technologies Inc. | Encryption apparatus for computer device |
EP0568438A1 (en) * | 1992-04-27 | 1993-11-03 | Gemplus Card International | Method for securing of executable programs against utilisation by an unauthorized person and security system for its application |
Non-Patent Citations (2)
Title |
---|
IEEE COMPUTER, Vol. 28, No. 1, January 1995, LOMAS et al., "To Whom am I Speaking", pp. 50-54. * |
IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, Vol. 21, No. 3, March 1995, ILGUN et al., "State Transition Analysis: A Rule-Based Intrusion Detection Approach", pp. 181-199. * |
Cited By (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8016189B2 (en) | 1996-12-04 | 2011-09-13 | Otomaku Properties Ltd., L.L.C. | Electronic transaction systems and methods therefor |
US8225089B2 (en) | 1996-12-04 | 2012-07-17 | Otomaku Properties Ltd., L.L.C. | Electronic transaction systems utilizing a PEAD and a private key |
US6668325B1 (en) | 1997-06-09 | 2003-12-23 | Intertrust Technologies | Obfuscation techniques for enhancing software security |
WO1999001815A1 (en) * | 1997-06-09 | 1999-01-14 | Intertrust, Incorporated | Obfuscation techniques for enhancing software security |
EP1000482A4 (en) * | 1997-08-06 | 2001-11-14 | Intel Corp | Cell array providing non-persistent secret storage through a mutation cycle |
EP1000482A1 (en) * | 1997-08-06 | 2000-05-17 | Intel Corporation | Cell array providing non-persistent secret storage through a mutation cycle |
EP2131524A3 (en) * | 1997-09-05 | 2010-02-17 | Intel Corporation | Tamper resistant methods and apparatus |
JP2001516908A (en) * | 1997-09-05 | 2001-10-02 | インテル・コーポレーション | Method and apparatus for preventing unauthorized intrusion |
EP1018236A4 (en) * | 1997-09-05 | 2000-10-11 | Intel Corp | Tamper resistant methods and apparatus |
EP1020049A4 (en) * | 1997-09-05 | 2000-10-11 | Intel Corp | Tamper resistant methods and apparatus |
EP2131524A2 (en) | 1997-09-05 | 2009-12-09 | Intel Corporation | Tamper resistant methods and apparatus |
EP1020049A1 (en) * | 1997-09-05 | 2000-07-19 | Intel Corporation | Tamper resistant methods and apparatus |
EP1010291A1 (en) * | 1997-09-05 | 2000-06-21 | Intel Corporation | A tamper resistant player for scrambled contents |
EP1018236A1 (en) * | 1997-09-05 | 2000-07-12 | Intel Corporation | Tamper resistant methods and apparatus |
DE19847677C2 (en) * | 1997-10-15 | 2003-08-14 | Dell Usa Lp | Computers, methods and devices for preventing unauthorized access to a computer program |
US6237137B1 (en) | 1997-10-15 | 2001-05-22 | Dell Usa, L.P. | Method and system for preventing unauthorized access to a computer program |
US6334189B1 (en) | 1997-12-05 | 2001-12-25 | Jamama, Llc | Use of pseudocode to protect software from unauthorized use |
US6643775B1 (en) | 1997-12-05 | 2003-11-04 | Jamama, Llc | Use of code obfuscation to inhibit generation of non-use-restricted versions of copy protected software applications |
US6480959B1 (en) | 1997-12-05 | 2002-11-12 | Jamama, Llc | Software system and associated methods for controlling the use of computer programs |
US7383569B1 (en) | 1998-03-02 | 2008-06-03 | Computer Associates Think, Inc. | Method and agent for the protection against the unauthorized use of computer resources |
EP0949574A2 (en) * | 1998-03-25 | 1999-10-13 | Arachnid, Incorporated | Customizable multimedia segment structures |
US6191780B1 (en) | 1998-03-25 | 2001-02-20 | Arachnid, Inc. | Customizable multimedia segment structures |
EP0949574A3 (en) * | 1998-03-25 | 2000-03-01 | Arachnid, Incorporated | Customizable multimedia segment structures |
WO1999056196A1 (en) * | 1998-04-30 | 1999-11-04 | Bindview Development Corporation | Computer security |
US7140005B2 (en) * | 1998-12-21 | 2006-11-21 | Intel Corporation | Method and apparatus to test an instruction sequence |
US6256737B1 (en) | 1999-03-09 | 2001-07-03 | Bionetrix Systems Corporation | System, method and computer program product for allowing access to enterprise resources using biometric devices |
US9398013B2 (en) | 1999-03-09 | 2016-07-19 | Citibank, N.A. | System, method and computer program product for an authentication management infrastructure |
WO2000065442A1 (en) * | 1999-04-23 | 2000-11-02 | Giesecke & Devrient Gmbh | Protection of the core part of a computer against external manipulation |
US6959391B1 (en) | 1999-04-23 | 2005-10-25 | Giesecke & Devrient Gmbh | Protection of the core part of computer against external manipulation |
WO2000065444A1 (en) * | 1999-04-28 | 2000-11-02 | Thomas Probert | Techniques for encoding information in computer code |
US6782478B1 (en) | 1999-04-28 | 2004-08-24 | Thomas Probert | Techniques for encoding information in computer code |
WO2000072112A3 (en) * | 1999-05-12 | 2001-04-05 | Fraunhofer Crcg Inc | Obfuscation of executable code |
US7421586B2 (en) | 1999-05-12 | 2008-09-02 | Fraunhofer Gesselschaft | Protecting mobile code against malicious hosts |
WO2000072112A2 (en) * | 1999-05-12 | 2000-11-30 | Fraunhofer Crcg, Inc. | Obfuscation of executable code |
US6842862B2 (en) | 1999-06-09 | 2005-01-11 | Cloakware Corporation | Tamper resistant software encoding |
US6594761B1 (en) | 1999-06-09 | 2003-07-15 | Cloakware Corporation | Tamper resistant software encoding |
WO2000077597A1 (en) * | 1999-06-09 | 2000-12-21 | Cloakware Corporation | Tamper resistant software encoding |
US7770016B2 (en) | 1999-07-29 | 2010-08-03 | Intertrust Technologies Corporation | Systems and methods for watermarking software and other media |
US6779114B1 (en) | 1999-08-19 | 2004-08-17 | Cloakware Corporation | Tamper resistant software-control flow encoding |
US9843447B1 (en) | 1999-09-09 | 2017-12-12 | Secure Axcess Llc | Authenticating electronic content |
US10355863B2 (en) | 1999-09-09 | 2019-07-16 | Secure Axcess Llc | System and method for authenticating electronic content |
US7240363B1 (en) * | 1999-10-06 | 2007-07-03 | Ellingson Robert E | System and method for thwarting identity theft and other identity misrepresentations |
US6728219B1 (en) | 1999-11-15 | 2004-04-27 | Networks Associates Technology, Inc. | Graphical user interface system and method for visually gauging network performance |
US6810017B1 (en) | 1999-11-15 | 2004-10-26 | Networks Associates Technology Inc. | Graphical user interface system and method for organized network analysis |
US9438633B1 (en) | 2000-03-23 | 2016-09-06 | Citibank, N.A. | System, method and computer program product for providing unified authentication services for online applications |
US9009798B2 (en) | 2000-03-23 | 2015-04-14 | Citibank, N.A. | System, method and computer program product for providing unified authentication services for online applications |
US7950048B2 (en) * | 2000-03-29 | 2011-05-24 | Microsoft Corporation | Methods and arrangements for limiting access to computer controlled functions and devices |
US7065652B1 (en) | 2000-06-21 | 2006-06-20 | Aladdin Knowledge Systems, Ltd. | System for obfuscating computer code upon disassembly |
WO2001099034A3 (en) * | 2000-06-21 | 2003-05-08 | Aladdin Knowledge Systems | System for obfuscating computer code to prevent disassembly |
WO2001099034A2 (en) * | 2000-06-21 | 2001-12-27 | Aladdin Knowledge Systems; Ltd. | System for obfuscating computer code to prevent disassembly |
US7260845B2 (en) * | 2001-01-09 | 2007-08-21 | Gabriel Kedma | Sensor for detecting and eliminating inter-process memory breaches in multitasking operating systems |
USRE43624E1 (en) * | 2001-01-09 | 2012-08-28 | Xiloprem Tre Limited Liability Company | Sensor for detecting and eliminating inter-process memory breaches in multitasking operating systems |
GB2391341A (en) * | 2002-07-31 | 2004-02-04 | Hewlett Packard Co | A method of validating the rights of a user to participate in an interactive computer environment |
GB2392276A (en) * | 2002-07-31 | 2004-02-25 | Hewlett Packard Development Co | A method of validating the rights of a user to participate in an interactive computer environment |
GB2392276B (en) * | 2002-07-31 | 2004-10-27 | Hewlett Packard Development Co | A method of validating performance of a participant in an interactive computing environment |
US7877613B2 (en) | 2002-09-04 | 2011-01-25 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Protecting mobile code against malicious hosts |
WO2007147495A3 (en) * | 2006-06-21 | 2008-08-28 | Wibu Systems Ag | Method and system for intrusion detection |
US8490191B2 (en) | 2006-06-21 | 2013-07-16 | Wibu-Systems Ag | Method and system for intrusion detection |
US8595809B2 (en) | 2008-04-09 | 2013-11-26 | Bank Of America Corporation | Reusable authentication experience tool |
US8136148B1 (en) | 2008-04-09 | 2012-03-13 | Bank Of America Corporation | Reusable authentication experience tool |
FR2986124A1 (en) * | 2012-01-25 | 2013-07-26 | Ercom Engineering Reseaux Comm | METHOD FOR AUTHENTICATING A DEVICE COMPRISING A PROCESSOR AND A CHIP CARD BY GENERATING A PATTERN |
WO2013110571A1 (en) * | 2012-01-25 | 2013-08-01 | Ercom Engineering Reseaux Communications | Method for authenticating a device including a processor and a smart card by pattern generation |
US10237073B2 (en) | 2015-01-19 | 2019-03-19 | InAuth, Inc. | Systems and methods for trusted path secure communication |
US10848317B2 (en) | 2015-01-19 | 2020-11-24 | InAuth, Inc. | Systems and methods for trusted path secure communication |
US11171790B2 (en) | 2015-01-19 | 2021-11-09 | Accertify, Inc. | Systems and methods for trusted path secure communication |
US11818274B1 (en) | 2015-01-19 | 2023-11-14 | Accertify, Inc. | Systems and methods for trusted path secure communication |
WO2017108977A1 (en) * | 2015-12-24 | 2017-06-29 | Gemalto Sa | Method and system for enhancing the security of a transaction |
EP3185194A1 (en) * | 2015-12-24 | 2017-06-28 | Gemalto Sa | Method and system for enhancing the security of a transaction |
US11157912B2 (en) | 2015-12-24 | 2021-10-26 | Thales Dis France Sa | Method and system for enhancing the security of a transaction |
US11880832B2 (en) | 2015-12-24 | 2024-01-23 | Thales Dis France Sas | Method and system for enhancing the security of a transaction |
CN110162937A (en) * | 2018-02-09 | 2019-08-23 | 黄冈职业技术学院 | The method for realizing protecting computer software based on network communication |
CN110162937B (en) * | 2018-02-09 | 2024-02-02 | 黄冈职业技术学院 | Method for realizing computer software protection based on network communication |
CN113343234A (en) * | 2021-06-10 | 2021-09-03 | 支付宝(杭州)信息技术有限公司 | Method and device for carrying out credible check on code security |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6006328A (en) | Computer software authentication, protection, and security system | |
WO1997004394A1 (en) | Computer software authentication, protection, and security system | |
US8261359B2 (en) | Systems and methods for preventing unauthorized use of digital content | |
Naumovich et al. | Preventing piracy, reverse engineering, and tampering | |
AU2002305490B2 (en) | Systems and methods for the prevention of unauthorized use and manipulation of digital content | |
Nagra et al. | Surreptitious software: obfuscation, watermarking, and tamperproofing for software protection | |
US5935246A (en) | Electronic copy protection mechanism using challenge and response to prevent unauthorized execution of software | |
EP2267626B1 (en) | Digital rights management system and method | |
CA2783822C (en) | Steganographic messaging system using code invariants | |
AU2002305490A1 (en) | Systems and methods for the prevention of unauthorized use and manipulation of digital content | |
CN103856481B (en) | The code protection method and system performed using on-line authentication and encrypted code | |
JPH08166879A (en) | Method and apparatus for reinforcement of safety of softwarefor distribution | |
Jo et al. | Security analysis and improvement of fingerprint authentication for smartphones | |
US20050091516A1 (en) | Secure attention instruction central processing unit and system architecture | |
AU725098B2 (en) | Computer software authentication, protection, and security system | |
Spalka et al. | Trojan horse attacks on software for electronic signatures | |
AU2002219852B2 (en) | Systems and methods for preventing unauthorized use of digital content | |
AU2002219852A1 (en) | Systems and methods for preventing unauthorized use of digital content | |
EP1637959A2 (en) | Systems and methods for preventing unauthorized use of digital content | |
AU2010202883B2 (en) | Systems and Methods for Preventing Unauthorized Use of Digital Content | |
AU2008200472A1 (en) | Systems and methods for preventing unauthorized use of digital content related applications | |
LaVenture et al. | Software and the virus threat: providing authenticity in distribution | |
FR2767240A1 (en) | Electronic signature system for use with computer transmitter documents |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): JP |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
122 | Ep: pct application non-entry in european phase |