Difference in interface usage statistics between the display interface brief command and the display interface command

Publication Date: 2014-11-12
Issue Description

On the NE40E/NE80E, two commands report the average interface usage over the last 300 seconds:
1.display interface brief [ main ]
2.display interface [ interface-type [ interface-number | main ] | slot slot-id [ main ] ]
The first command shows the average inbound (InUti) and outbound (OutUti) bandwidth usage of the interface over the last 300 seconds. The second command shows the rate at which bits and packets were received and sent by the interface over the last 300 seconds.
As follows:
1.Use the display interface brief command:
Interface              PHY  Protocol  InUti  OutUti  inErrors  outErrors
GigabitEthernet3/0/18  up   down         0%    100%         0         33
2.Use the display interface GigabitEthernet3/0/18 command:
Statistics last cleared:never  
Last 300 seconds input rate: 0 bits/sec, 0 packets/sec
Last 300 seconds output rate: 834561632 bits/sec, 1088372 packets/sec
In the first output, OutUti already shows 100%, yet in the second output the rate is only 834561632 bits/sec (about 835 Mbit/s on a 1 Gbit/s interface).

Handling Process

When the display interface [ interface-type [ interface-number | main ] | slot slot-id [ main ] ] command is used, the "input/output rate" statistics do not include the Inter-Packet Gap (12 bytes) or the Preamble (8 bytes).
When the display interface brief [ main ] command is used, the "InUti/OutUti" values include this 20-byte overhead (Inter-Packet Gap plus Preamble) for every packet.
Inter-Packet Gap (IPG):
A delay or time gap between CSMA/CD packets intended to provide interframe recovery time for other CSMA/CD sublayers and for the physical medium. For example, for 10BASE-T the IPG is 9.6 μs (96 bit times); for 100BASE-T it is 0.96 μs (96 bit times).
Preamble:
A sequence of bits transmitted at the beginning of each Ethernet frame (seven preamble bytes plus a one-byte start frame delimiter, eight bytes in total) that allows the receiver to synchronize before the frame contents arrive.
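Given the 20-byte per-packet overhead, the 100% OutUti reading can be reconciled with the raw counters from the display interface output. The following is a minimal sketch (assuming a 1 Gbit/s GigabitEthernet link; the function name is illustrative, not a device API):

```python
# Reconcile the two counters: "display interface" reports the raw bit rate,
# while "display interface brief" adds 20 bytes (IPG + preamble) per packet.

LINK_SPEED_BPS = 1_000_000_000  # GigabitEthernet: 1 Gbit/s (assumption)
OVERHEAD_BYTES = 20             # 12-byte inter-packet gap + 8-byte preamble

def utilization(bit_rate: float, packet_rate: float) -> float:
    """Return line utilization (%) including per-packet IPG/preamble overhead."""
    wire_rate = bit_rate + packet_rate * OVERHEAD_BYTES * 8
    return min(100.0, wire_rate / LINK_SPEED_BPS * 100)

# Values from the "display interface GigabitEthernet3/0/18" output above:
bit_rate = 834_561_632      # bits/sec, excludes IPG and preamble
packet_rate = 1_088_372     # packets/sec

print(f"{utilization(bit_rate, packet_rate):.0f}%")  # → 100%
```

The per-packet overhead adds 1088372 × 20 × 8 ≈ 174 Mbit/s on the wire, so the true line load is about 1009 Mbit/s, which exceeds the 1 Gbit/s link speed. That is why OutUti shows 100% even though the reported output rate is only about 835 Mbit/s.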

Root Cause