What version of Go are you using (go version)?
1.9
Does this issue reproduce with the latest release?
Yes.
What operating system and processor architecture are you using (go env)?
Windows
What did you do?
I am trying to parse large XML files in a SAX-like (streaming) fashion with Go, and the performance is poor. I rewrote the same code in C#, and it runs at full speed.

The Go code:
// Imports needed: bufio, encoding/xml, io, os.
file, err := os.Open(filename)
handle(err)
defer file.Close()

// 256 MiB read buffer, so that disk reads are not the limiting factor.
buffer := bufio.NewReaderSize(file, 1024*1024*256)
decoder := xml.NewDecoder(buffer)
for {
    t, err := decoder.Token()
    if err == io.EOF {
        break
    }
    handle(err)
    switch se := t.(type) {
    case xml.StartElement:
        if se.Name.Local == "House" {
            house := House{}
            err := decoder.DecodeElement(&house, &se)
            handle(err)
        }
    }
}
The equivalent C# code:

// Requires: using System.Xml;
using (XmlReader reader = XmlReader.Create(filename))
{
    while (reader.Read())
    {
        switch (reader.NodeType)
        {
            case XmlNodeType.Element:
                if (reader.Name == "House")
                {
                    // Code
                }
                break;
        }
    }
}
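
For anyone who wants to reproduce the Go slowdown without a multi-gigabyte input file, here is a minimal, self-contained benchmark sketch of the same token loop over a synthetic in-memory document. The House struct and its single id attribute are placeholders, since the real schema is not part of this report:

package housebench

import (
    "bytes"
    "encoding/xml"
    "io"
    "testing"
)

// House stands in for the reporter's struct; the real fields are unknown,
// so a single attribute is used as a placeholder.
type House struct {
    ID string `xml:"id,attr"`
}

// sampleXML builds an in-memory document containing n <House> elements.
func sampleXML(n int) []byte {
    var b bytes.Buffer
    b.WriteString("<Houses>")
    for i := 0; i < n; i++ {
        b.WriteString(`<House id="1"/>`)
    }
    b.WriteString("</Houses>")
    return b.Bytes()
}

func BenchmarkDecodeHouses(b *testing.B) {
    doc := sampleXML(10000)
    b.SetBytes(int64(len(doc)))
    b.ResetTimer()
    for i := 0; i < b.N; i++ {
        decoder := xml.NewDecoder(bytes.NewReader(doc))
        for {
            t, err := decoder.Token()
            if err == io.EOF {
                break
            }
            if err != nil {
                b.Fatal(err)
            }
            if se, ok := t.(xml.StartElement); ok && se.Name.Local == "House" {
                var house House
                if err := decoder.DecodeElement(&house, &se); err != nil {
                    b.Fatal(err)
                }
            }
        }
    }
}

Saved in a _test.go file and run with go test -bench=DecodeHouses -benchmem, the allocation counts reported by -benchmem are usually the most telling number for this decoder.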
What did you expect to see?
A mature and fast XML parser in Go.
What did you see instead?
The bottleneck in SAX-style XML parsing with Go is the CPU, not HDD I/O.
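
One way to confirm that the parse is CPU-bound rather than I/O-bound is to run it under a CPU profile. A minimal sketch, assuming the token loop above has been wrapped in a hypothetical parseFile helper:

package main

import (
    "log"
    "os"
    "runtime/pprof"
)

func main() {
    // Write a CPU profile that covers the whole parse.
    f, err := os.Create("cpu.prof")
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()

    if err := pprof.StartCPUProfile(f); err != nil {
        log.Fatal(err)
    }
    defer pprof.StopCPUProfile()

    parseFile(os.Args[1]) // hypothetical wrapper around the token loop shown above
}

Inspecting the result with go tool pprof cpu.prof (then top) shows where the time actually goes; if the observation above is right, the hot entries should sit inside encoding/xml rather than in the file-read path.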