SVSI-2017
Organized by: CSE Department, MCKVIE
Sponsored by: CSI-MCKVIE Student Chapter and MCKV Institute of Engineering
TiMerS: Time-Based Music Recommender System
Gobinda Karmakar, Kehkasha Naushad, Sudipta Chakrabarty
Dept. of MCA, Techno India, Salt Lake, EM 4/1, Sector V, Kolkata – 700091, India
Introduction
A music recommendation system supports personalized music listening: the listener creates a profile with the service and builds up a music library based on his or her preferences, using mobile cloud services.
A music recommendation system estimates the user's music preferences from a localized music collection and then generates a set of songs from a wider music collection based on these estimates.
In this paper we propose a method that generates song libraries based on the different time slots of a day.
Proposed Work
A time-slot-based music recommender is very effective for mobile music listeners.
The primary objective of this work is to determine the appropriate playing time of any song in a listener's music library and to build time-slot-based music libraries: the Raga of each song is identified and then matched against the raga playing-time database.
Proposed Work Contd…
To find the raga name of a song, the song must be run through the WaveSurfer software, which is used in this experiment to obtain the pitch values of that song.
WaveSurfer is an open-source tool for sound visualization and manipulation. It is free audio-editing software widely used for interactive display of sound waveforms, spectral sections, spectrograms, pitch contours, and pitch transcriptions.
WaveSurfer can read and write a number of pitch transcription file formats used in speech-processing research. A '.wav' music file is appropriate for creating the pitch file of the music.
Proposed Work Contd…
Keeping the sample encoding at Lin16 (16-bit linear) and the sample rate at 22050 Hz, the pitch contour of the music is generated. This gives all the pitches used in the song, and the pitch data are saved in .f0 format. The .f0 file consists of a large number of frequency values of the monophonic melody, and we convert this .f0 file into .xls format.
The FFT (Fast Fourier Transform) of the .f0 file values is then analysed. An FFT of a time-domain signal takes the samples and gives the frequencies, amplitudes, and phases of the sine waves that make up the analysed sound wave. In this case, the sample rate is 44.1 kHz and the FFT size is 1024, so the bin width is the Nyquist frequency (44,100/2 = 22,050 Hz) divided by the FFT size, or about 22 Hz.
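As a rough illustration of the arithmetic above, the Python sketch below reproduces the bin-width calculation and the FFT of a single frame; the random samples stand in for a real audio frame, and all variable names are illustrative rather than part of the original work.
```python
import numpy as np

SAMPLE_RATE = 44_100          # Hz, as used for the FFT analysis on this slide
FFT_SIZE = 1024

# Nyquist frequency and the bin width quoted above
nyquist = SAMPLE_RATE / 2                  # 22,050 Hz
bin_width = nyquist / FFT_SIZE             # ~21.5 Hz, i.e. "about 22 Hz"

# FFT of one frame of a (hypothetical) mono signal
samples = np.random.randn(FFT_SIZE)        # stand-in for a real audio frame
spectrum = np.fft.rfft(samples)
freqs = np.fft.rfftfreq(FFT_SIZE, d=1.0 / SAMPLE_RATE)
magnitudes = np.abs(spectrum)              # amplitudes of the sine components
phases = np.angle(spectrum)                # phases of the sine components
```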
Proposed Work Contd….
Only frequency values within 50 Hz to 500 Hz are accepted.
The number of occurrences of each frequency in the 50–500 Hz range is then calculated.
The frequency with the highest number of occurrences corresponds to the most important note, and the name of that note is obtained by matching it against the fundamental frequency ranges given in Table 1. After the most important note has been found, the other 11 notes are derived using the Note Frequency Ratio table, and the octave is also determined. The Note Frequency Ratio is also given in Table 1.
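A minimal Python sketch of this counting step is given below (the ranges follow Table 1 on the next slide); it assumes the pitch values have already been read from the .f0/.xls data into a list, and the function and variable names are placeholders rather than the authors' code.
```python
# Fundamental frequency ranges from Table 1 (Hz)
NOTE_RANGES = {
    "A/Dha": (215.5, 226.5), "A#/ni": (226.6, 239.9), "B/Ni": (240.0, 252.5),
    "C/Sa": (252.6, 269.5),  "C#/re": (269.6, 285.3), "D/Re": (285.4, 302.3),
    "D#/ga": (302.4, 320.3), "E/Ga": (320.4, 339.3),  "F/ma": (339.4, 359.4),
    "F#/MA": (359.5, 380.9), "G/Pa": (381.0, 403.5),  "G#/dha": (403.6, 427.5),
}

def note_occurrences(frequencies):
    """Count, for each note of Table 1, how many pitch values fall in its range."""
    counts = {note: 0 for note in NOTE_RANGES}
    for f in frequencies:
        if not (50.0 <= f <= 500.0):       # keep only values within 50-500 Hz
            continue
        for note, (low, high) in NOTE_RANGES.items():
            if low <= f <= high:
                counts[note] += 1
                break
    return counts

def most_important_note(counts):
    """The note with the highest occurrence count (the Vadi)."""
    return max(counts, key=counts.get)
```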
Proposed Work Contd….
Western Note (Indian Note)   Frequency Range (Hz)   Note Frequency Ratio
A  (Dha)     215.5 – 226.5     1.000
A# (ni)      226.6 – 239.9     1.054
B  (Ni)      240.0 – 252.5     1.125
C  (Sa)      252.6 – 269.5     1.186
C# (re)      269.6 – 285.3     1.253
D  (Re)      285.4 – 302.3     1.333
D# (ga)      302.4 – 320.3     1.415
E  (Ga)      320.4 – 339.3     1.500
F  (ma)      339.4 – 359.4     1.580
F# (MA)      359.5 – 380.9     1.691
G  (Pa)      381.0 – 403.5     1.777
G# (dha)     403.6 – 427.5     1.893
Table 1. Frequency Range and Note Frequency Ratio of different Notes
Proposed Work Contd….
The frequency values of all 12 notes are then matched against the Excel sheet, and a note is taken as present if its frequencies appear there. From each note's frequency range, the most frequently occurring frequencies with significant amplitude among the observed frequencies are identified.
In all there will be 12 such frequencies, and from among these 12 the 7 most prominent frequencies are identified.
We have a Raga knowledge base that contains the 7-note structure of each Raga. The 7-note structure of the input song is matched against the knowledge base; if a match is found, the name of the Raga is printed, otherwise "Mismatch" is printed.
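A minimal sketch of the knowledge-base lookup is shown below, assuming the Raga knowledge base is a mapping from raga name to its note structure; the two entries are illustrative stand-ins, not the actual knowledge base used in the work.
```python
# Hypothetical raga knowledge base: raga name -> set of notes (illustrative entries only)
RAGA_KNOWLEDGE_BASE = {
    "Desh":  {"Sa/C", "Re/D", "Ga/E", "ma/F", "Pa/G", "Dha/A", "Ni/B"},
    "Yaman": {"Sa/C", "Re/D", "Ga/E", "MA/F#", "Pa/G", "Dha/A", "Ni/B"},
}

def recognise_raga(prominent_notes):
    """Match the 7 most prominent notes of the input song against the knowledge base."""
    for raga, structure in RAGA_KNOWLEDGE_BASE.items():
        if set(prominent_notes) == structure:
            return raga
    return "Mismatch"
```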
Proposed Work Contd….
After the raga name of the song has been found, it is matched against the raga playing-time database, the song is saved in the corresponding preferred-time folder, and a personalized time-slot-based music recommendation is created.
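This final step can be sketched as follows, assuming the raga playing-time database is a simple mapping from raga name to a time slot and that each time-slot library is a folder; all names, paths, and slot labels are placeholders.
```python
import shutil
from pathlib import Path

# Hypothetical raga playing-time database: raga -> time slot of the day
RAGA_PLAYING_TIME = {
    "Desh": "night",
    "Bhairav": "early_morning",
}

def file_into_time_slot(song_path, raga, library_root="time_slot_library"):
    """Copy the song into the folder of its raga's playing-time slot."""
    slot = RAGA_PLAYING_TIME.get(raga)
    if slot is None:
        return None
    slot_dir = Path(library_root) / slot
    slot_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy(song_path, slot_dir / Path(song_path).name)
    return slot_dir
```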
Work Flow
Result Set Analysis
Test Case:
Song Title: “Aji rooth kar”
Playback Singer: Lata Mangeshkar
Film Name: Arzoo (1965)
Music Director: Shankar Jaikishan
Lyrics: Hasrat Jaipuri
Cast: Rajendra Kumar, Sadhana Shivdasani, Feroz Khan, Nazima, Nasir Hussain
Film Director: Ramanand Sagar
Result Set Analysis Contd….
First, the .wav file of the song is used to create the pitch file. Keeping the sample encoding at Lin16 and the sample rate at 22050 Hz, the pitch contour of the song is generated.
All the pitches used in the song are saved as pitch data in .f0 format. This .f0 file is then converted into an .xls file, as shown in Table 2.
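A hedged sketch of this conversion step is shown below, assuming the WaveSurfer .f0 file is whitespace-separated text with the pitch (F0) value in the first column; the exact column layout can vary between configurations, and the file names are placeholders.
```python
import pandas as pd

def f0_to_spreadsheet(f0_path, out_path):
    """Read a WaveSurfer .f0 pitch file and save it as a spreadsheet.

    Assumes whitespace-separated text with the F0 value in the first
    column; any additional columns are kept unchanged.
    """
    frame = pd.read_csv(f0_path, sep=r"\s+", header=None)
    frame = frame.rename(columns={0: "f0_hz"})
    frame.to_excel(out_path, index=False)  # e.g. "aji_rooth_kar.xlsx" (needs openpyxl)
    return frame
```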
Result Set Analysis Contd….
Note      Number of Occurrences   Frequency Range (Hz)
A/Dha     1725                    215.5 – 226.5
A#/ni     4205                    226.6 – 239.9
B/Ni      1770                    240.0 – 252.5
C/Sa      2241                    252.6 – 269.5
C#/re     2                       269.6 – 285.3
D/Re      4646                    285.4 – 302.3
D#/ga     2836                    302.4 – 320.3
E/Ga      1017                    320.4 – 339.3
F/ma      2319                    339.4 – 359.4
F#/MA     2187                    359.5 – 380.9
G/Pa      4512                    381.0 – 403.5
G#/dha    2993                    403.6 – 427.5
Table 2. Note Occurrences of the song “Aji rooth kar”
Result Set Analysis Contd….
From Table 2, the song “Aji rooth kar” has only 2 occurrences of C#, which is negligible, whereas the other 11 notes have very high occurrence counts. Therefore C# is absent in this song. The highest occurrence count, 4646, belongs to the fundamental frequency range of the “D/Re” note.
Therefore D or Re is the most important note (Vadi Swar). The second highest occurrence count, 4512, belongs to the fundamental frequency range of the “G/Pa” note. Consequently G or Pa is the second most important note (Samvadi Swar). The remaining notes are then found using the note frequency ratios of Table 1 and mapping them against the values of the .xls file, giving Sa/C, ma/F, Pa/G, Ni/B, ni/A#, Dha/A, Pa/G, ma/F, Ga/E, Re/D, Sa/C.
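For illustration, the same selection can be reproduced directly from the occurrence counts of Table 2; the dictionary below merely transcribes the table, and the threshold used to mark a note as absent is an assumption.
```python
# Occurrence counts from Table 2 for "Aji rooth kar"
counts = {
    "A/Dha": 1725, "A#/ni": 4205, "B/Ni": 1770, "C/Sa": 2241,
    "C#/re": 2,    "D/Re": 4646,  "D#/ga": 2836, "E/Ga": 1017,
    "F/ma": 2319,  "F#/MA": 2187, "G/Pa": 4512,  "G#/dha": 2993,
}

ranked = sorted(counts, key=counts.get, reverse=True)
vadi, samvadi = ranked[0], ranked[1]               # "D/Re" and "G/Pa"
absent = [n for n, c in counts.items() if c < 10]  # ["C#/re"], treated as absent
```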
Result Set Analysis Contd….
Other Information:
Most Important Note or Vadi: D/Re
Second Most Important Note or Samvadi: G/Pa
Ascent or Aroho: Sa/C, Re/D, ma/F, Pa/G, Ni/B
Descent or Aboroho: ni/A#, Dha/A, Pa/G, ma/F, Ga/E, Re/D, Sa/C
Raga Name: Desh
Species or Jati: Pentatonic – Heptatonic (5 – 7)
Conclusion
This work describes a time-slot-based music recommendation system: a technique that provides listeners with personalized music according to the appropriate raga playing time, using a raga-time knowledge base.
The work is a step towards developing a method to assist and evaluate Raga recognition in Indian music and to identify the playing time of each raga.
In this contribution, eighty Indian songs have been tested, their raga names and raga playing times determined, and an appropriate mobile music recommendation system built over different time slots of a day on the mobile cloud.
References
Sudipta Chakrabarty, Md Ruhul Islam, Debashis De, “Modelling of Song Pattern Similarity using Coefficient of Variance”, International Journal of Computer Science and Information Security (ISSN 1947-5500), pp. 388-394, 2017.
Samarjit Roy, Sudipta Chakrabarty, Debashis De, “Time-Based Raga Recommendation and
Information Retrieval of Musical Patterns in Indian Classical Music using Neural Network”,
IAES International Journal of Artificial Intelligence (IJ-AI) (ISSN: 2252-8938), pp. 33-48, 2017.
Sudipta Chakrabarty, Samarjit Roy, Debashis De, Edited by Siddhartha Bhattacharyya,
Hrishikesh Bhaumik, Sourav De, Goran Klepac, Handbook of Research on Intelligent
Analysis of Multimedia information (Hardcover),Chapter 12: Time-Slot Based Intelligent
Music Recommender in Indian Music, ISBN13: 9781522504986, ISBN10: 1522504982,
Publisher: IGI Global, USA, 2016.
References Contd….
Sudipta Chakrabarty, Samarjit Roy, Debashis De, “A Foremost Survey on State-Of-The-Art Computational Music Research”, In Proceedings of the Recent Trends in Computations and Mathematical Analysis in Engineering and Sciences, International Science Congress Association, pp. 16-25, 2015.
Sudipta Chakrabarty, Samarjit Roy, Debashis De, “Behavioural Modelling of Ragas of Indian Classical Music using Unified Modelling Language”, In Proceedings of the 2nd International Conference on Perception and Machine Intelligence, ACM, pp. 151-160, 2015.
Sudipta Chakrabarty, Samarjit Roy, Debashis De, “Automatic Raga Recognition using Fundamental Frequency Range of Extracted Musical Notes”, Accepted In: Eighth International MultiConference on Image and Signal Processing (ICISP 2014), Elsevier, pp. 337-345, 2014.
Sudipta Chakrabarty, Samarjit Roy, Debashis De, "Pervasive Diary in Music Rhythm
Education: A Context-Aware Learning Tool Using Genetic Algorithm." Advanced
Computing, Networking and Informatics-Volume 1. Springer International Publishing, pp.
669-677, 2014.
References Contd….
Samarjit Roy, Sudipta Chakrabarty, Pradipta Bhakta, Debashis De, “Modelling High
Performing Music Computing using Petri Nets,” Accepted In: International Conference on
Control, Instrumentation, Energy and Communication (CIEC), IEEE, pp. 678-682, 2014.
Samarjit Roy, Sudipta Chakrabarty, Debashis De, "A Framework of Musical Pattern
Recognition Using Petri Nets." In the Proceedings of Emerging Trends in Computing and
Communication. Springer India, pp. 245-252, 2014.
Sudipta Chakrabarty, Debashis De, “Quality Measure Model of Music Rhythm using
Genetic Algorithm”, In Proceedings of International Conference on Radar, Communication
and Computing (ICRCC), IEEE, pp. 125-130, 2012.
Thank You….
Any Questions?
