Full metadata record
DC Field | Value | Language
dc.contributor.author | Li, Y. | en_US
dc.date.accessioned | 2014-12-08T15:24:43Z | -
dc.date.available | 2014-12-08T15:24:43Z | -
dc.date.issued | 2006 | en_US
dc.identifier.isbn | 3-540-32861-0 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/17165 | -
dc.description.abstract | In this paper, a hybrid intelligent computational methodology is presented for the parameter extraction of compact models. This solution technique integrates the genetic algorithm (GA), the neural network (NN), and the Levenberg-Marquardt (LM) method for current-voltage (I-V) curve characterization, optimization, and parameter extraction of deep-submicron metal-oxide-semiconductor field-effect transistors (MOSFETs). For a specified compact model, this unified optimization technique extracts the corresponding set of parameters with respect to measured data. The GA searches for solutions according to feedback from the NN, while the LM method solves a local optimization problem starting from the GA's candidates. The well-known BSIM and EKV compact models of MOSFETs have been studied and implemented for automatic parameter extraction. In terms of accuracy and convergence, the proposed optimization technique is computationally verified and shown to be advantageous for the parameter extraction of MOSFETs. Comparisons among the pure GA approach, the GA-NN solution, the GA-LM solution, and the proposed method are also discussed. (See the illustrative sketch following this record.) | en_US
dc.language.iso | en_US | en_US
dc.title | A hybrid intelligent computational methodology for semiconductor device equivalent circuit model parameter extraction | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | SCIENTIFIC COMPUTING IN ELECTRICAL ENGINEERING | en_US
dc.citation.volume | 9 | en_US
dc.citation.spage | 345 | en_US
dc.citation.epage | 350 | en_US
dc.contributor.department | 電信工程研究所 | zh_TW
dc.contributor.department | Institute of Communications Engineering | en_US
dc.identifier.wosnumber | WOS:000241665300049 | -
Appears in Collections: Conferences Paper
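The abstract above describes a hybrid GA + NN + LM parameter-extraction flow: the GA explores the parameter space, an NN gives fitness feedback to the GA, and LM refines candidates locally against measured I-V data. The following is a minimal, hypothetical sketch of such a loop, not the authors' implementation: a toy square-law MOSFET model with parameters (K, Vth) stands in for the BSIM/EKV compact models, the "measured" data are synthetic, and the library choices (NumPy, scikit-learn's MLPRegressor as the NN surrogate, SciPy's Levenberg-Marquardt least_squares) are illustrative assumptions only.

```python
# Hypothetical sketch of a hybrid GA + NN-surrogate + Levenberg-Marquardt
# parameter-extraction loop (illustration only; not the paper's implementation).
import numpy as np
from scipy.optimize import least_squares
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy compact model standing in for BSIM/EKV: saturation-region square law,
# Id = K * (Vgs - Vth)^2, with parameters p = (K, Vth).
def model_iv(params, vgs):
    k, vth = params
    return k * np.clip(vgs - vth, 0.0, None) ** 2

# Synthetic "measured" I-V curve (real use would load measured data instead).
vgs = np.linspace(0.0, 1.8, 40)
i_meas = model_iv((2.0e-4, 0.45), vgs) + rng.normal(0.0, 1e-6, vgs.size)

def rmse(params):
    return float(np.sqrt(np.mean((model_iv(params, vgs) - i_meas) ** 2)))

bounds = np.array([[1e-5, 1e-3],    # K search range
                   [0.20, 0.80]])   # Vth search range
pop = rng.uniform(bounds[:, 0], bounds[:, 1], (30, 2))   # initial GA population
archive_x, archive_y = [], []                            # exact evaluations for the NN

for gen in range(15):
    fitness = np.array([rmse(p) for p in pop])           # exact (expensive) evaluation
    archive_x.extend(pop)
    archive_y.extend(fitness)

    # NN surrogate trained on every exactly evaluated candidate so far
    # (inputs/outputs are left unscaled here for brevity).
    surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), solver="lbfgs",
                             max_iter=2000, random_state=0)
    surrogate.fit(np.array(archive_x), np.array(archive_y))

    # Selection: keep the better half of the population as parents.
    parents = pop[np.argsort(fitness)[: len(pop) // 2]]

    # Variation: blend crossover plus Gaussian mutation, over-producing offspring ...
    idx = rng.integers(0, len(parents), (4 * len(pop), 2))
    alpha = rng.uniform(size=(4 * len(pop), 1))
    offspring = alpha * parents[idx[:, 0]] + (1.0 - alpha) * parents[idx[:, 1]]
    offspring += rng.normal(0.0, 0.05, offspring.shape) * (bounds[:, 1] - bounds[:, 0])
    offspring = np.clip(offspring, bounds[:, 0], bounds[:, 1])

    # ... which the NN pre-screens by predicted fitness (the feedback to the GA).
    pop = offspring[np.argsort(surrogate.predict(offspring))[: len(pop)]]

# Levenberg-Marquardt local refinement starting from the GA's best candidate.
best = min(archive_x, key=rmse)
fit = least_squares(lambda p: model_iv(p, vgs) - i_meas, best, method="lm")
print("GA best candidate :", best, " RMSE =", rmse(best))
print("LM refined result :", fit.x, " RMSE =", rmse(fit.x))
```

In this sketch the NN is retrained each generation on all exactly evaluated candidates and pre-screens the GA's offspring, while LM refines only the single best GA candidate; the paper's actual coupling, model equations, and parameter sets may differ.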